[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
18911 1727096280.59600: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-And
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
18911 1727096280.59913: Added group all to inventory
18911 1727096280.59914: Added group ungrouped to inventory
18911 1727096280.59917: Group all now contains ungrouped
18911 1727096280.59919: Examining possible inventory source: /tmp/network-EuO/inventory.yml
18911 1727096280.69968: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
18911 1727096280.70012: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
18911 1727096280.70029: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
18911 1727096280.70072: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
18911 1727096280.70122: Loaded config def from plugin (inventory/script)
18911 1727096280.70123: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
18911 1727096280.70150: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
18911 1727096280.70209: Loaded config def from plugin 
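The deprecation warning at the top of the run names its own fix: use the singular ANSIBLE_COLLECTIONS_PATH variable, or set the equivalent key in ansible.cfg. A minimal sketch of such a config, assuming a user-maintained ansible.cfg (this run found none, "No config file found; using defaults"); the path value mirrors the collection location reported above and is otherwise illustrative:

```ini
# ansible.cfg -- illustrative sketch, not taken from this run
[defaults]
# Singular key; avoids the deprecated ANSIBLE_COLLECTIONS_PATHS spelling
collections_path = /tmp/collections-And
# Optionally silence deprecation warnings entirely, as the warning text suggests
deprecation_warnings = False
```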
(inventory/yaml) 18911 1727096280.70210: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py 18911 1727096280.70275: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py 18911 1727096280.70553: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py 18911 1727096280.70562: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py) 18911 1727096280.70565: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py) 18911 1727096280.70575: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py) 18911 1727096280.70580: Loading data from /tmp/network-EuO/inventory.yml 18911 1727096280.70626: /tmp/network-EuO/inventory.yml was not parsable by auto 18911 1727096280.70675: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py) 18911 1727096280.70704: Loading data from /tmp/network-EuO/inventory.yml 18911 1727096280.70754: group all already in inventory 18911 1727096280.70759: set inventory_file for managed_node1 18911 1727096280.70764: set inventory_dir for managed_node1 18911 1727096280.70765: Added host managed_node1 to inventory 18911 1727096280.70769: Added host managed_node1 to group all 18911 1727096280.70770: set ansible_host for managed_node1 18911 1727096280.70771: set ansible_ssh_extra_args for managed_node1 18911 1727096280.70774: set inventory_file for managed_node2 18911 1727096280.70776: set inventory_dir for managed_node2 18911 1727096280.70777: Added host managed_node2 to inventory 18911 1727096280.70778: Added host managed_node2 to group all 18911 1727096280.70778: set ansible_host for managed_node2 18911 1727096280.70779: set ansible_ssh_extra_args for managed_node2 18911 
1727096280.70780: set inventory_file for managed_node3 18911 1727096280.70782: set inventory_dir for managed_node3 18911 1727096280.70782: Added host managed_node3 to inventory 18911 1727096280.70783: Added host managed_node3 to group all 18911 1727096280.70783: set ansible_host for managed_node3 18911 1727096280.70784: set ansible_ssh_extra_args for managed_node3 18911 1727096280.70785: Reconcile groups and hosts in inventory. 18911 1727096280.70788: Group ungrouped now contains managed_node1 18911 1727096280.70789: Group ungrouped now contains managed_node2 18911 1727096280.70790: Group ungrouped now contains managed_node3 18911 1727096280.70840: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 18911 1727096280.70925: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 18911 1727096280.70953: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 18911 1727096280.70975: Loaded config def from plugin (vars/host_group_vars) 18911 1727096280.70976: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 18911 1727096280.70981: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 18911 1727096280.70987: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 18911 1727096280.71017: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 18911 1727096280.71251: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096280.71324: Loading ModuleDocFragment 'connection_pipelining' from 
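The inventory parse above (yaml plugin, hosts managed_node1 through managed_node3 placed in group all/ungrouped, with ansible_host and ansible_ssh_extra_args set per host) implies an inventory file of roughly this shape. This is a hypothetical reconstruction: only the host names and the variable names come from the log; the address for managed_node1 is taken from the SSH trace later in the run, and the remaining values are placeholders:

```yaml
# /tmp/network-EuO/inventory.yml -- hypothetical reconstruction from the log
all:
  hosts:
    managed_node1:
      ansible_host: 10.31.11.125            # seen in the SSH debug output below
      ansible_ssh_extra_args: "-o StrictHostKeyChecking=no"   # placeholder
    managed_node2:
      ansible_host: 203.0.113.2             # placeholder
      ansible_ssh_extra_args: "-o StrictHostKeyChecking=no"   # placeholder
    managed_node3:
      ansible_host: 203.0.113.3             # placeholder
      ansible_ssh_extra_args: "-o StrictHostKeyChecking=no"   # placeholder
```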
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 18911 1727096280.71347: Loaded config def from plugin (connection/local) 18911 1727096280.71349: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 18911 1727096280.71733: Loaded config def from plugin (connection/paramiko_ssh) 18911 1727096280.71735: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 18911 1727096280.72294: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 18911 1727096280.72320: Loaded config def from plugin (connection/psrp) 18911 1727096280.72322: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 18911 1727096280.72748: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 18911 1727096280.72773: Loaded config def from plugin (connection/ssh) 18911 1727096280.72776: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 18911 1727096280.74059: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 18911 1727096280.74085: Loaded config def from plugin (connection/winrm) 18911 1727096280.74087: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 18911 1727096280.74109: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 18911 1727096280.74154: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 18911 1727096280.74195: Loaded config def from plugin (shell/cmd) 18911 1727096280.74197: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 18911 1727096280.74214: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 18911 1727096280.74250: Loaded config def from plugin (shell/powershell) 18911 1727096280.74251: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 18911 1727096280.74292: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 18911 1727096280.74393: Loaded config def from plugin (shell/sh) 18911 1727096280.74395: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 18911 1727096280.74417: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 18911 1727096280.74491: Loaded config def from plugin (become/runas) 18911 1727096280.74493: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 18911 1727096280.74600: Loaded config def from plugin (become/su) 18911 1727096280.74602: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 18911 1727096280.74695: Loaded config def from plugin (become/sudo) 18911 
1727096280.74697: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 18911 1727096280.74720: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml 18911 1727096280.74940: in VariableManager get_vars() 18911 1727096280.74955: done with get_vars() 18911 1727096280.75046: trying /usr/local/lib/python3.12/site-packages/ansible/modules 18911 1727096280.76997: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 18911 1727096280.77072: in VariableManager get_vars() 18911 1727096280.77075: done with get_vars() 18911 1727096280.77077: variable 'playbook_dir' from source: magic vars 18911 1727096280.77078: variable 'ansible_playbook_python' from source: magic vars 18911 1727096280.77078: variable 'ansible_config_file' from source: magic vars 18911 1727096280.77079: variable 'groups' from source: magic vars 18911 1727096280.77079: variable 'omit' from source: magic vars 18911 1727096280.77080: variable 'ansible_version' from source: magic vars 18911 1727096280.77080: variable 'ansible_check_mode' from source: magic vars 18911 1727096280.77081: variable 'ansible_diff_mode' from source: magic vars 18911 1727096280.77081: variable 'ansible_forks' from source: magic vars 18911 1727096280.77081: variable 'ansible_inventory_sources' from source: magic vars 18911 1727096280.77082: variable 'ansible_skip_tags' from source: magic vars 18911 1727096280.77082: variable 'ansible_limit' from source: magic vars 18911 1727096280.77083: variable 'ansible_run_tags' from source: magic vars 18911 1727096280.77083: variable 'ansible_verbosity' from source: magic vars 18911 1727096280.77109: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml 18911 1727096280.77540: in VariableManager 
get_vars() 18911 1727096280.77551: done with get_vars() 18911 1727096280.77577: in VariableManager get_vars() 18911 1727096280.77591: done with get_vars() 18911 1727096280.77612: in VariableManager get_vars() 18911 1727096280.77620: done with get_vars() 18911 1727096280.77637: in VariableManager get_vars() 18911 1727096280.77646: done with get_vars() 18911 1727096280.77698: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 18911 1727096280.77818: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 18911 1727096280.77909: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 18911 1727096280.78284: in VariableManager get_vars() 18911 1727096280.78300: done with get_vars() 18911 1727096280.78598: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ 18911 1727096280.78701: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 18911 1727096280.80008: in VariableManager get_vars() 18911 1727096280.80075: done with get_vars() 18911 1727096280.80316: in VariableManager get_vars() 18911 1727096280.80327: done with get_vars() 18911 1727096280.80330: variable 'playbook_dir' from source: magic vars 18911 1727096280.80334: variable 'ansible_playbook_python' from source: magic vars 18911 1727096280.80335: variable 'ansible_config_file' from source: magic vars 18911 1727096280.80335: variable 'groups' from source: magic vars 18911 1727096280.80336: variable 'omit' from source: magic vars 18911 1727096280.80337: variable 'ansible_version' from source: magic vars 18911 1727096280.80338: variable 'ansible_check_mode' from source: magic vars 18911 1727096280.80338: variable 'ansible_diff_mode' from source: magic vars 18911 1727096280.80339: variable 'ansible_forks' from source: 
magic vars 18911 1727096280.80340: variable 'ansible_inventory_sources' from source: magic vars 18911 1727096280.80341: variable 'ansible_skip_tags' from source: magic vars 18911 1727096280.80342: variable 'ansible_limit' from source: magic vars 18911 1727096280.80343: variable 'ansible_run_tags' from source: magic vars 18911 1727096280.80344: variable 'ansible_verbosity' from source: magic vars 18911 1727096280.80383: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml 18911 1727096280.80471: in VariableManager get_vars() 18911 1727096280.80475: done with get_vars() 18911 1727096280.80477: variable 'playbook_dir' from source: magic vars 18911 1727096280.80478: variable 'ansible_playbook_python' from source: magic vars 18911 1727096280.80479: variable 'ansible_config_file' from source: magic vars 18911 1727096280.80480: variable 'groups' from source: magic vars 18911 1727096280.80481: variable 'omit' from source: magic vars 18911 1727096280.80481: variable 'ansible_version' from source: magic vars 18911 1727096280.80482: variable 'ansible_check_mode' from source: magic vars 18911 1727096280.80483: variable 'ansible_diff_mode' from source: magic vars 18911 1727096280.80483: variable 'ansible_forks' from source: magic vars 18911 1727096280.80484: variable 'ansible_inventory_sources' from source: magic vars 18911 1727096280.80485: variable 'ansible_skip_tags' from source: magic vars 18911 1727096280.80486: variable 'ansible_limit' from source: magic vars 18911 1727096280.80486: variable 'ansible_run_tags' from source: magic vars 18911 1727096280.80487: variable 'ansible_verbosity' from source: magic vars 18911 1727096280.80507: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml 18911 1727096280.80623: in VariableManager get_vars() 18911 1727096280.80643: done with get_vars() 18911 1727096280.80721: 
Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 18911 1727096280.80795: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 18911 1727096280.80863: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 18911 1727096280.81297: in VariableManager get_vars() 18911 1727096280.81310: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 18911 1727096280.82319: in VariableManager get_vars() 18911 1727096280.82337: done with get_vars() 18911 1727096280.82361: in VariableManager get_vars() 18911 1727096280.82363: done with get_vars() 18911 1727096280.82365: variable 'playbook_dir' from source: magic vars 18911 1727096280.82365: variable 'ansible_playbook_python' from source: magic vars 18911 1727096280.82366: variable 'ansible_config_file' from source: magic vars 18911 1727096280.82366: variable 'groups' from source: magic vars 18911 1727096280.82366: variable 'omit' from source: magic vars 18911 1727096280.82369: variable 'ansible_version' from source: magic vars 18911 1727096280.82369: variable 'ansible_check_mode' from source: magic vars 18911 1727096280.82370: variable 'ansible_diff_mode' from source: magic vars 18911 1727096280.82370: variable 'ansible_forks' from source: magic vars 18911 1727096280.82370: variable 'ansible_inventory_sources' from source: magic vars 18911 1727096280.82371: variable 'ansible_skip_tags' from source: magic vars 18911 1727096280.82371: variable 'ansible_limit' from source: magic vars 18911 1727096280.82372: variable 'ansible_run_tags' from source: magic vars 18911 1727096280.82372: variable 'ansible_verbosity' from source: magic vars 18911 1727096280.82392: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml 18911 1727096280.82438: in 
VariableManager get_vars() 18911 1727096280.82445: done with get_vars() 18911 1727096280.82474: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 18911 1727096280.84208: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 18911 1727096280.84292: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 18911 1727096280.84674: in VariableManager get_vars() 18911 1727096280.84694: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 18911 1727096280.86259: in VariableManager get_vars() 18911 1727096280.86276: done with get_vars() 18911 1727096280.86312: in VariableManager get_vars() 18911 1727096280.86324: done with get_vars() 18911 1727096280.86387: in VariableManager get_vars() 18911 1727096280.86400: done with get_vars() 18911 1727096280.86494: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 18911 1727096280.86509: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 18911 1727096280.86738: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 18911 1727096280.86899: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 18911 1727096280.86902: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-And/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) 18911 1727096280.86933: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 18911 1727096280.86959: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 18911 1727096280.87126: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 18911 1727096280.87188: Loaded config def from plugin (callback/default) 18911 1727096280.87191: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 18911 1727096280.88297: Loaded config def from plugin (callback/junit) 18911 1727096280.88300: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 18911 1727096280.88344: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 18911 1727096280.88409: Loaded config def from plugin (callback/minimal) 18911 1727096280.88412: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 18911 1727096280.88454: Loading CallbackModule 'oneline' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 18911 1727096280.88515: Loaded config def from plugin (callback/tree) 18911 1727096280.88518: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks 18911 1727096280.88643: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks) 18911 1727096280.88646: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-And/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) Skipping callback 'default', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. 
PLAYBOOK: tests_ethernet_nm.yml ************************************************
10 plays in /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml
18911 1727096280.88675: in VariableManager get_vars()
18911 1727096280.88690: done with get_vars()
18911 1727096280.88696: in VariableManager get_vars()
18911 1727096280.88705: done with get_vars()
18911 1727096280.88715: variable 'omit' from source: magic vars
18911 1727096280.88753: in VariableManager get_vars()
18911 1727096280.88767: done with get_vars()
18911 1727096280.88790: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_ethernet.yml' with nm as provider] *********
18911 1727096280.89320: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
18911 1727096280.89393: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
18911 1727096280.89424: getting the remaining hosts for this loop
18911 1727096280.89426: done getting the remaining hosts for this loop
18911 1727096280.89429: getting the next task for host managed_node1
18911 1727096280.89433: done getting next task for host managed_node1
18911 1727096280.89434: ^ task is: TASK: Gathering Facts
18911 1727096280.89437: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18911 1727096280.89445: getting variables
18911 1727096280.89446: in VariableManager get_vars()
18911 1727096280.89456: Calling all_inventory to load vars for managed_node1
18911 1727096280.89459: Calling groups_inventory to load vars for managed_node1
18911 1727096280.89461: Calling all_plugins_inventory to load vars for managed_node1
18911 1727096280.89476: Calling all_plugins_play to load vars for managed_node1
18911 1727096280.89487: Calling groups_plugins_inventory to load vars for managed_node1
18911 1727096280.89491: Calling groups_plugins_play to load vars for managed_node1
18911 1727096280.89542: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18911 1727096280.89664: done with get_vars()
18911 1727096280.89673: done getting variables
18911 1727096280.89735: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml:6
Monday 23 September 2024 08:58:00 -0400 (0:00:00.011) 0:00:00.011 ******
18911 1727096280.89756: entering _queue_task() for managed_node1/gather_facts
18911 1727096280.89758: Creating lock for gather_facts
18911 1727096280.90133: worker is 1 (out of 1 available)
18911 1727096280.90144: exiting _queue_task() for managed_node1/gather_facts
18911 1727096280.90157: done queuing things up, now waiting for results queue to drain
18911 1727096280.90158: waiting for pending results... 
18911 1727096280.90396: running TaskExecutor() for managed_node1/TASK: Gathering Facts 18911 1727096280.90457: in run() - task 0afff68d-5257-09a7-aae1-00000000007c 18911 1727096280.90484: variable 'ansible_search_path' from source: unknown 18911 1727096280.90574: calling self._execute() 18911 1727096280.90607: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096280.90621: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096280.90640: variable 'omit' from source: magic vars 18911 1727096280.90760: variable 'omit' from source: magic vars 18911 1727096280.90799: variable 'omit' from source: magic vars 18911 1727096280.90873: variable 'omit' from source: magic vars 18911 1727096280.90903: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18911 1727096280.90955: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18911 1727096280.91046: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18911 1727096280.91049: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096280.91052: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096280.91062: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18911 1727096280.91078: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096280.91087: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096280.91206: Set connection var ansible_shell_executable to /bin/sh 18911 1727096280.91227: Set connection var ansible_timeout to 10 18911 1727096280.91235: Set connection var ansible_shell_type to sh 18911 1727096280.91410: Set connection var ansible_module_compression to 
ZIP_DEFLATED 18911 1727096280.91414: Set connection var ansible_pipelining to False 18911 1727096280.91416: Set connection var ansible_connection to ssh 18911 1727096280.91418: variable 'ansible_shell_executable' from source: unknown 18911 1727096280.91420: variable 'ansible_connection' from source: unknown 18911 1727096280.91423: variable 'ansible_module_compression' from source: unknown 18911 1727096280.91425: variable 'ansible_shell_type' from source: unknown 18911 1727096280.91427: variable 'ansible_shell_executable' from source: unknown 18911 1727096280.91430: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096280.91432: variable 'ansible_pipelining' from source: unknown 18911 1727096280.91434: variable 'ansible_timeout' from source: unknown 18911 1727096280.91437: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096280.91979: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18911 1727096280.91983: variable 'omit' from source: magic vars 18911 1727096280.91986: starting attempt loop 18911 1727096280.91988: running the handler 18911 1727096280.91991: variable 'ansible_facts' from source: unknown 18911 1727096280.91994: _low_level_execute_command(): starting 18911 1727096280.92109: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18911 1727096280.93535: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096280.93620: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 
originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096280.93770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096280.93812: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096280.93876: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096280.94198: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096280.95923: stdout chunk (state=3): >>>/root <<< 18911 1727096280.96069: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096280.96085: stdout chunk (state=3): >>><<< 18911 1727096280.96099: stderr chunk (state=3): >>><<< 18911 1727096280.96130: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096280.96150: _low_level_execute_command(): starting 18911 1727096280.96172: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096280.961373-18934-118319348086136 `" && echo ansible-tmp-1727096280.961373-18934-118319348086136="` echo /root/.ansible/tmp/ansible-tmp-1727096280.961373-18934-118319348086136 `" ) && sleep 0' 18911 1727096280.96881: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096280.96903: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 18911 1727096280.96913: stderr 
chunk (state=3): >>>debug2: match found <<< 18911 1727096280.96997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096280.97043: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096280.97118: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096280.99522: stdout chunk (state=3): >>>ansible-tmp-1727096280.961373-18934-118319348086136=/root/.ansible/tmp/ansible-tmp-1727096280.961373-18934-118319348086136 <<< 18911 1727096280.99526: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096280.99528: stdout chunk (state=3): >>><<< 18911 1727096280.99530: stderr chunk (state=3): >>><<< 18911 1727096280.99532: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096280.961373-18934-118319348086136=/root/.ansible/tmp/ansible-tmp-1727096280.961373-18934-118319348086136 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096280.99534: variable 'ansible_module_compression' from source: unknown 18911 1727096280.99734: ANSIBALLZ: Using generic lock for ansible.legacy.setup 18911 1727096280.99738: ANSIBALLZ: Acquiring lock 18911 1727096280.99741: ANSIBALLZ: Lock acquired: 140481135532592 18911 1727096280.99743: ANSIBALLZ: Creating module 18911 1727096281.38041: ANSIBALLZ: Writing module into payload 18911 1727096281.38201: ANSIBALLZ: Writing module 18911 1727096281.38234: ANSIBALLZ: Renaming module 18911 1727096281.38246: ANSIBALLZ: Done creating module 18911 1727096281.38275: variable 'ansible_facts' from source: unknown 18911 1727096281.38287: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18911 1727096281.38300: _low_level_execute_command(): starting 18911 1727096281.38310: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 18911 1727096281.39086: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096281.39102: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096281.39117: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096281.39224: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096281.40925: stdout chunk (state=3): >>>PLATFORM <<< 18911 1727096281.40992: stdout chunk (state=3): >>>Linux <<< 18911 1727096281.41021: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 18911 1727096281.41228: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096281.41232: stdout chunk (state=3): >>><<< 18911 1727096281.41234: stderr chunk (state=3): >>><<< 18911 1727096281.41372: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096281.41378 [managed_node1]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 18911 1727096281.41383: _low_level_execute_command(): starting 18911 1727096281.41385: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 18911 1727096281.41525: Sending initial data 18911 1727096281.41529: Sent initial data (1181 bytes) 18911 1727096281.41986: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096281.42002: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096281.42060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 
10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096281.42131: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096281.42164: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096281.42181: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096281.42288: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096281.45783: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 18911 1727096281.46192: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096281.46473: stderr chunk (state=3): >>><<< 18911 1727096281.46476: stdout chunk (state=3): >>><<< 18911 1727096281.46479: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 
10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096281.46482: variable 'ansible_facts' from source: unknown 18911 1727096281.46484: variable 'ansible_facts' from source: unknown 18911 1727096281.46486: variable 'ansible_module_compression' from source: unknown 18911 1727096281.46488: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18911d7od04qi/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 18911 1727096281.46491: variable 'ansible_facts' from source: unknown 18911 1727096281.46619: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096280.961373-18934-118319348086136/AnsiballZ_setup.py 18911 1727096281.46853: Sending initial data 18911 1727096281.46856: Sent initial data (153 bytes) 18911 1727096281.47438: stderr chunk (state=3): >>>OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096281.47454: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096281.47494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096281.47514: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18911 1727096281.47607: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096281.47621: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096281.47645: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096281.47743: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096281.49385: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: 
Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18911 1727096281.49482: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18911 1727096281.49554: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18911d7od04qi/tmpa79f96kl /root/.ansible/tmp/ansible-tmp-1727096280.961373-18934-118319348086136/AnsiballZ_setup.py <<< 18911 1727096281.49577: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096280.961373-18934-118319348086136/AnsiballZ_setup.py" <<< 18911 1727096281.49607: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18911d7od04qi/tmpa79f96kl" to remote "/root/.ansible/tmp/ansible-tmp-1727096280.961373-18934-118319348086136/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096280.961373-18934-118319348086136/AnsiballZ_setup.py" <<< 18911 1727096281.51304: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096281.51336: stderr chunk (state=3): >>><<< 18911 1727096281.51455: stdout chunk (state=3): >>><<< 18911 1727096281.51459: done transferring module to remote 18911 1727096281.51461: _low_level_execute_command(): starting 18911 1727096281.51464: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096280.961373-18934-118319348086136/ /root/.ansible/tmp/ansible-tmp-1727096280.961373-18934-118319348086136/AnsiballZ_setup.py && sleep 0' 18911 1727096281.52084: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096281.52132: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096281.52147: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096281.52165: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096281.52340: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096281.54225: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096281.54230: stdout chunk (state=3): >>><<< 18911 1727096281.54233: stderr chunk (state=3): >>><<< 18911 1727096281.54351: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096281.54355: _low_level_execute_command(): starting 18911 1727096281.54357: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096280.961373-18934-118319348086136/AnsiballZ_setup.py && sleep 0' 18911 1727096281.54908: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096281.54916: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096281.54928: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096281.54942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096281.54954: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 18911 1727096281.54973: stderr chunk (state=3): >>>debug2: match not found <<< 18911 1727096281.54976: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096281.54990: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18911 1727096281.55121: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.125 is address <<< 18911 
1727096281.55124: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 18911 1727096281.55126: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096281.55128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096281.55130: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096281.55132: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096281.55134: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096281.55136: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096281.55173: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096281.55295: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096281.57644: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 18911 1727096281.57652: stdout chunk (state=3): >>>import _imp # builtin <<< 18911 1727096281.57690: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 18911 1727096281.57746: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 18911 1727096281.57786: stdout chunk (state=3): >>>import 'posix' # <<< 18911 1727096281.57858: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 18911 1727096281.57861: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # <<< 18911 1727096281.57864: stdout chunk (state=3): >>># installed zipimport hook <<< 18911 1727096281.57905: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/__init__.py <<< 18911 1727096281.57924: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # <<< 18911 1727096281.57946: stdout chunk (state=3): >>>import 'codecs' # <<< 18911 1727096281.57980: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 18911 1727096281.58020: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91c184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91be7b30> <<< 18911 1727096281.58048: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91c1aa50> <<< 18911 1727096281.58090: stdout chunk (state=3): >>>import '_signal' # <<< 18911 1727096281.58117: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <<< 18911 1727096281.58132: stdout chunk (state=3): >>>import 'io' # <<< 18911 1727096281.58160: stdout chunk (state=3): >>>import '_stat' # <<< 18911 1727096281.58174: stdout chunk (state=3): >>>import 'stat' # <<< 18911 1727096281.58248: stdout chunk (state=3): >>>import '_collections_abc' # <<< 18911 1727096281.58273: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 18911 1727096281.58313: stdout chunk (state=3): >>>import 'os' # <<< 18911 1727096281.58333: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' <<< 18911 1727096281.58371: 
stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 18911 1727096281.58393: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 18911 1727096281.58418: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91a2d130> <<< 18911 1727096281.58474: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 18911 1727096281.58507: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91a2dfa0> <<< 18911 1727096281.58524: stdout chunk (state=3): >>>import 'site' # <<< 18911 1727096281.58543: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 18911 1727096281.58934: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 18911 1727096281.58938: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 18911 1727096281.58964: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 18911 1727096281.58991: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 18911 1727096281.59025: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 18911 1727096281.59048: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 18911 1727096281.59075: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 18911 1727096281.59095: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91a6bdd0> <<< 18911 1727096281.59120: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 18911 1727096281.59145: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 18911 1727096281.59161: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91a6bfe0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 18911 1727096281.59193: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 18911 1727096281.59212: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 18911 1727096281.59280: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 18911 1727096281.59315: stdout chunk (state=3): >>>import 'itertools' # <<< 18911 1727096281.59344: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91aa37a0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91aa3e30> <<< 18911 1727096281.59369: stdout chunk (state=3): >>>import '_collections' # <<< 18911 1727096281.59430: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91a83aa0> <<< 18911 1727096281.59437: stdout chunk (state=3): >>>import '_functools' # <<< 18911 1727096281.59457: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91a811c0> <<< 18911 1727096281.59552: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91a68f80> <<< 18911 1727096281.59567: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 18911 1727096281.59596: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<< 18911 1727096281.59624: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches 
/usr/lib64/python3.12/re/_parser.py <<< 18911 1727096281.59656: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 18911 1727096281.59681: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 18911 1727096281.59716: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91ac3710> <<< 18911 1727096281.59747: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91ac2330> <<< 18911 1727096281.59765: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91a82090> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91ac0b90> <<< 18911 1727096281.59829: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 18911 1727096281.59832: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91af8740> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91a68200> <<< 18911 1727096281.59886: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 18911 1727096281.59889: stdout chunk (state=3): >>># extension module '_struct' loaded from 
'/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a91af8bf0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91af8aa0> <<< 18911 1727096281.59931: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a91af8e90> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91a66d20> <<< 18911 1727096281.59972: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 18911 1727096281.59996: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 18911 1727096281.60031: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91af9580> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91af9250> <<< 18911 1727096281.60049: stdout chunk (state=3): >>>import 'importlib.machinery' # <<< 18911 1727096281.60093: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py <<< 18911 1727096281.60112: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' 
import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91afa480> import 'importlib.util' # <<< 18911 1727096281.60136: stdout chunk (state=3): >>>import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 18911 1727096281.60177: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 18911 1727096281.60206: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91b10680> <<< 18911 1727096281.60260: stdout chunk (state=3): >>>import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 18911 1727096281.60265: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a91b11d60> <<< 18911 1727096281.60312: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 18911 1727096281.60336: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 18911 1727096281.60341: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91b12c00> <<< 18911 1727096281.60371: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension 
module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a91b13260> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91b12150> <<< 18911 1727096281.60400: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 18911 1727096281.60445: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 18911 1727096281.60449: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 18911 1727096281.60474: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a91b13ce0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91b13410> <<< 18911 1727096281.60501: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91afa4b0> <<< 18911 1727096281.60536: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 18911 1727096281.60551: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 18911 1727096281.60581: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 18911 1727096281.60596: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 18911 1727096281.60649: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed 
from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a9184bbc0> <<< 18911 1727096281.60654: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 18911 1727096281.60690: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a918746b0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91874410> <<< 18911 1727096281.60708: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a918746e0> <<< 18911 1727096281.60733: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 18911 1727096281.60749: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 18911 1727096281.60816: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 18911 1727096281.60942: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a91875010> <<< 
18911 1727096281.61084: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 18911 1727096281.61103: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a91875a00> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a918748c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91849d60> <<< 18911 1727096281.61124: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 18911 1727096281.61148: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 18911 1727096281.61203: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 18911 1727096281.61220: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91876d80> <<< 18911 1727096281.61223: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91875880> <<< 18911 1727096281.61255: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91afaba0> <<< 18911 1727096281.61258: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 18911 1727096281.61312: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 18911 1727096281.61338: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 18911 1727096281.61361: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 18911 1727096281.61395: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a9189f110> <<< 18911 1727096281.61465: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 18911 1727096281.61469: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 18911 1727096281.61497: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 18911 1727096281.61510: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 18911 1727096281.61536: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a918c3440> <<< 18911 1727096281.61560: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 18911 1727096281.61616: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 18911 1727096281.61658: stdout chunk (state=3): >>>import 'ntpath' # <<< 18911 1727096281.61691: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a919241a0> <<< 18911 1727096281.61704: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc 
matches /usr/lib64/python3.12/urllib/parse.py <<< 18911 1727096281.61746: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 18911 1727096281.61758: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 18911 1727096281.61799: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 18911 1727096281.61880: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91926900> <<< 18911 1727096281.61960: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a919242c0> <<< 18911 1727096281.61991: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a918f1220> <<< 18911 1727096281.62035: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py <<< 18911 1727096281.62050: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a9172d250> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a918c2240> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91877ce0> <<< 18911 1727096281.62226: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 18911 1727096281.62244: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f5a9172d4f0> <<< 18911 1727096281.62508: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_poat8byj/ansible_ansible.legacy.setup_payload.zip' # 
zipimport: zlib available <<< 18911 1727096281.62637: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.62656: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 18911 1727096281.62683: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 18911 1727096281.62716: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 18911 1727096281.62821: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 18911 1727096281.62841: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a9178efc0> <<< 18911 1727096281.62850: stdout chunk (state=3): >>>import '_typing' # <<< 18911 1727096281.63045: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a9176deb0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a9176d010> <<< 18911 1727096281.63075: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.63093: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available <<< 18911 1727096281.63123: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18911 1727096281.63144: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 18911 1727096281.63158: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.64589: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.65808: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a9178ce60> <<< 18911 1727096281.65814: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 18911 1727096281.65872: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 18911 1727096281.65885: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 18911 1727096281.65911: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a917c2900> <<< 18911 1727096281.65945: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a917c2690> <<< 18911 1727096281.65980: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a917c1fa0> <<< 18911 1727096281.66008: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 18911 1727096281.66054: stdout chunk 
(state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a917c23f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a9178f9e0> <<< 18911 1727096281.66094: stdout chunk (state=3): >>>import 'atexit' # <<< 18911 1727096281.66101: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a917c36b0> <<< 18911 1727096281.66125: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a917c38f0> <<< 18911 1727096281.66146: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 18911 1727096281.66207: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # <<< 18911 1727096281.66270: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a917c3e30> <<< 18911 1727096281.66300: stdout chunk (state=3): >>>import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 18911 1727096281.66317: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 18911 1727096281.66357: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a9112daf0> <<< 18911 1727096281.66397: stdout chunk (state=3): >>># extension module 'select' 
loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 18911 1727096281.66430: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a9112f2c0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 18911 1727096281.66447: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 18911 1727096281.66465: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91130140> <<< 18911 1727096281.66493: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 18911 1727096281.66531: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 18911 1727096281.66554: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a911312e0> <<< 18911 1727096281.66569: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 18911 1727096281.66942: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91133dd0> <<< 18911 1727096281.66968: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from 
'/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a91b12b70> <<< 18911 1727096281.66973: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91132090> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 18911 1727096281.66993: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 18911 1727096281.67007: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a9113bdd0> import '_tokenize' # <<< 18911 1727096281.67095: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a9113a8a0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a9113a630> <<< 18911 1727096281.67110: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 18911 1727096281.67182: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a9113ab70> <<< 18911 1727096281.67235: stdout chunk (state=3): >>>import 'traceback' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f5a911325a0> <<< 18911 1727096281.67238: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 18911 1727096281.67272: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a9117fa10> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91180140> <<< 18911 1727096281.67311: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py<<< 18911 1727096281.67338: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 18911 1727096281.67358: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 18911 1727096281.67393: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a91181be0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a911819a0> <<< 18911 1727096281.67423: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 18911 1727096281.67453: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 18911 1727096281.67518: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a91184140> <<< 18911 1727096281.67522: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a911822d0> <<< 18911 1727096281.67527: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 18911 1727096281.67579: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 18911 1727096281.67613: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 18911 1727096281.67616: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 18911 1727096281.67630: stdout chunk (state=3): >>>import '_string' # <<< 18911 1727096281.67735: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91187830> <<< 18911 1727096281.67793: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91184230> <<< 18911 1727096281.67861: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from 
'/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a911885f0> <<< 18911 1727096281.67891: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a91188830> <<< 18911 1727096281.67939: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a91188950> <<< 18911 1727096281.67981: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a911802f0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 18911 1727096281.68006: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 18911 1727096281.68030: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 18911 1727096281.68052: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 18911 1727096281.68088: stdout chunk (state=3): >>># extension module '_socket' executed from 
'/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a910141d0> <<< 18911 1727096281.68248: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a91015370> <<< 18911 1727096281.68300: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a9118a960> <<< 18911 1727096281.68305: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a9118bd10> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a9118a5a0> <<< 18911 1727096281.68353: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # <<< 18911 1727096281.68359: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.68444: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.68555: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.68563: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 18911 1727096281.68597: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 18911 1727096281.68726: stdout chunk (state=3): >>># zipimport: zlib available 
<<< 18911 1727096281.68844: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.69399: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.69947: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # <<< 18911 1727096281.70000: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 18911 1727096281.70003: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 18911 1727096281.70056: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a910195e0> <<< 18911 1727096281.70142: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 18911 1727096281.70170: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a9101a390> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91015580> <<< 18911 1727096281.70237: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 18911 1727096281.70271: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18911 1727096281.70286: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # # zipimport: zlib available <<< 18911 1727096281.70433: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 18911 1727096281.70600: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 18911 1727096281.70621: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a9101a450> <<< 18911 1727096281.70633: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.71087: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.71537: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.71607: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.71693: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 18911 1727096281.71696: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.71726: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.71772: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 18911 1727096281.71782: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.71844: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.71928: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 18911 1727096281.71964: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 18911 1727096281.71985: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.72024: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.72059: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 18911 1727096281.72076: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.72301: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.72537: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 18911 1727096281.72614: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 18911 1727096281.72618: stdout chunk (state=3): >>>import '_ast' # <<< 18911 1727096281.72695: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a9101b590> # zipimport: zlib available <<< 18911 1727096281.72772: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.72847: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 18911 1727096281.72881: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 18911 1727096281.72899: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.72927: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.72971: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 18911 1727096281.72982: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.73022: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.73073: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.73125: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.73194: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 18911 1727096281.73232: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 18911 1727096281.73330: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from 
'/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a91026240> <<< 18911 1727096281.73375: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91020e00> <<< 18911 1727096281.73407: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 18911 1727096281.73426: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.73481: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.73544: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.73577: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.73645: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 18911 1727096281.73675: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 18911 1727096281.73698: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 18911 1727096281.73747: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 18911 1727096281.73772: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 18911 
1727096281.73779: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 18911 1727096281.73845: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a9110ebd0> <<< 18911 1727096281.74211: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a911fe8a0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a910263c0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a910155e0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 18911 1727096281.74263: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.74352: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.74377: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.74404: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.74466: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.74524: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.74576: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.74620: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 18911 1727096281.74639: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.74755: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.74891: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18911 1727096281.74946: stdout chunk (state=3): >>>import 
'ansible.module_utils.compat.typing' # <<< 18911 1727096281.74951: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.75239: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.75561: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18911 1727096281.75642: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py <<< 18911 1727096281.75647: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 18911 1727096281.75670: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 18911 1727096281.75692: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 18911 1727096281.75713: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 18911 1727096281.75753: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 18911 1727096281.75784: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a910ba2a0> <<< 18911 1727096281.75849: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 18911 1727096281.75905: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 18911 1727096281.75933: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 18911 1727096281.75959: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a90c781d0> <<< 18911 1727096281.76016: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a90c78530> <<< 18911 1727096281.76111: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a910a0170> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a910bade0> <<< 18911 1727096281.76146: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a910b8950> <<< 18911 1727096281.76180: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a910b92b0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 18911 1727096281.76248: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 18911 1727096281.76280: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 18911 1727096281.76318: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 18911 1727096281.76350: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a90c7b440> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a90c7acf0> <<< 18911 1727096281.76409: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a90c7ae70> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a90c7a120> <<< 18911 1727096281.76425: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 18911 1727096281.76610: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 18911 1727096281.76650: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a90c7b470> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 18911 1727096281.76692: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 18911 1727096281.76774: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from 
'/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a90cd9f70> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a90c7bf50> <<< 18911 1727096281.76849: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a910b8d40> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # <<< 18911 1727096281.76872: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18911 1727096281.76895: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other' # <<< 18911 1727096281.76908: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.76958: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.77017: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 18911 1727096281.77048: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.77091: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.77174: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 18911 1727096281.77177: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.77194: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available <<< 18911 1727096281.77216: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.77246: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # <<< 18911 1727096281.77256: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.77306: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 18911 1727096281.77365: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 18911 1727096281.77415: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.77463: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available <<< 18911 1727096281.77544: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.77585: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.77635: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.77707: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # <<< 18911 1727096281.77715: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.cmdline' # <<< 18911 1727096281.77720: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.78525: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.79299: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 18911 1727096281.79356: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.79450: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.79536: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.79571: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.79614: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # <<< 18911 1727096281.79623: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.date_time' # <<< 18911 1727096281.79634: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.79717: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.env' # <<< 18911 1727096281.79725: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.79815: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.79894: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 18911 1727096281.79935: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.79994: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # <<< 18911 1727096281.80010: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.80043: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.80089: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 18911 1727096281.80101: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.80278: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.80358: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 18911 1727096281.80365: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 18911 1727096281.80398: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a90cdb530> <<< 18911 1727096281.80434: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 18911 1727096281.80471: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 18911 1727096281.80712: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a90cdab70> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 18911 1727096281.80917: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # <<< 18911 1727096281.80932: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 
1727096281.81211: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available <<< 18911 1727096281.81215: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.81282: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 18911 1727096281.81311: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.81329: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.81430: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 18911 1727096281.81501: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 18911 1727096281.81565: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a90d16180> <<< 18911 1727096281.81766: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a90d044d0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 18911 1727096281.81827: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.81887: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 18911 1727096281.81896: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.81982: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.82062: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.82183: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.82373: stdout chunk (state=3): >>>import 
'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 18911 1727096281.82391: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.82427: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available <<< 18911 1727096281.82447: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.82499: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 18911 1727096281.82634: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 18911 1727096281.82639: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a90d29d30> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a90d29970> import 'ansible.module_utils.facts.system.user' # <<< 18911 1727096281.82657: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available <<< 18911 1727096281.82691: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 18911 1727096281.82875: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.83078: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available <<< 18911 1727096281.83123: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.83215: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 
1727096281.83277: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.83337: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available <<< 18911 1727096281.83422: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.83492: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.83685: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 18911 1727096281.83688: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.83769: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.84051: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 18911 1727096281.84530: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.85042: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # <<< 18911 1727096281.85063: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available <<< 18911 1727096281.85155: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.85263: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 18911 1727096281.85282: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.85371: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.85466: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 18911 1727096281.85490: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.85629: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.85798: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available <<< 18911 1727096281.85823: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available <<< 18911 1727096281.85873: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.85915: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 18911 1727096281.85927: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.86060: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.86272: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.86419: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.86535: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available <<< 18911 1727096281.86561: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.86603: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 18911 1727096281.86637: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.86662: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 18911 1727096281.86755: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.86860: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 18911 1727096281.86863: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.86922: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 18911 1727096281.86935: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.87054: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available <<< 18911 1727096281.87179: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.87183: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available <<< 18911 1727096281.87373: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.87643: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available <<< 18911 1727096281.87724: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.87783: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 18911 1727096281.87840: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.87850: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.87933: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available <<< 18911 1727096281.87980: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available <<< 18911 1727096281.87983: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.88041: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 18911 1727096281.88073: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.88148: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.88187: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 18911 1727096281.88258: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # <<< 18911 1727096281.88290: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18911 1727096281.88319: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # 
<<< 18911 1727096281.88374: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.88407: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18911 1727096281.88488: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18911 1727096281.88550: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.88650: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 18911 1727096281.88661: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.88728: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.88793: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 18911 1727096281.89042: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.89163: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available <<< 18911 1727096281.89215: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.89265: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 18911 1727096281.89377: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.89396: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available <<< 18911 1727096281.89453: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.89539: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 18911 1727096281.89555: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.89642: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 
1727096281.89742: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 18911 1727096281.89833: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096281.90783: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 18911 1727096281.90906: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a90b2b4d0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a90b29a60> <<< 18911 1727096281.90917: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a90b2aa20> <<< 18911 1727096282.00974: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py <<< 18911 1727096282.01041: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a90b72ae0><<< 18911 1727096282.01082: stdout chunk (state=3): >>> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py<<< 
18911 1727096282.01094: stdout chunk (state=3): >>> <<< 18911 1727096282.01175: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a90b710a0><<< 18911 1727096282.01183: stdout chunk (state=3): >>> <<< 18911 1727096282.01254: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py <<< 18911 1727096282.01328: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' <<< 18911 1727096282.01334: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py <<< 18911 1727096282.01381: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a90b72930> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a90b71f70> <<< 18911 1727096282.01789: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 18911 1727096282.28678: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_is_chroot": false, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", 
"SSH_CONNECTION": "10.31.13.159 40710 10.31.11.125 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 40710 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", 
"ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-125", "ansible_nodename": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2082d1236a0674562ec5e13633e7ec", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2957, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 574, "free": 2957}, "nocache": {"free": 3294, "used": 237}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_uuid": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": 
["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 435, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261795434496, "block_size": 4096, "block_total": 65519099, "block_available": 63914901, "block_used": 1604198, "inode_total": 131070960, "inode_available": 131029099, "inode_used": 41861, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", 
"tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": 
"vif-0", "promisc": false, "ipv4": {"address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10ff:acff:fe3f:90f5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", 
"hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.125"], "ansible_all_ipv6_addresses": ["fe80::10ff:acff:fe3f:90f5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.125", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10ff:acff:fe3f:90f5"]}, "ansible_fibre_channel_wwn": [], "ansible_pkg_mgr": "dnf", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCSodd+GDkKMH+cjKateYxMnFqGIiE1cnsha6Cte8GTIWdeerR6yaWx/72lnuD/E1u9q+QiEGNjwKJ+EVlEbMpe1E41fS12cQMWtucXlJvMJH1Ykit9m7FF02YjLW9EkA9SSgc18cxhBxGHI+sOyXoiqbbp/q8CeDDI6WWQ+a3ZlRONcuaxwv9JBCJY6ZIqEDUsG55fAkwTft0LrM9VfN5D7iCcrlRTAVcjACJKgdvzHt52sVh5wuhCgzisIGrxldgqYAo0PnKgLwHzMWTYpwBJQIwMf6hC7O+PTnljDRXhsY8vTDdVI9W4Nr7mLWJ7W5JZ0lrKajT+FZ1b8usigp6firlnTtqzVmKJ4HJTlBOQ00HmCvVRSFb8Y15UB8Gz/xjJSXiQklJoKt+/wRRazuZALEowaIIdTzQUGv5Yx/BYhcAv8KVvhqbfkvyuwAgI+jzck3nhlUJ3B1WVYs5RdDY1WWEbqEtKyERi7P+cyWg6OhEMmJIpLs1QBW+CU/89w+s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE9Vvl7mU+dsvtGHlOyMt0Nx2BOUkSxor6Ul/pQgHJkP0t5ie5s9TW9uniTzz5NfayEXaMlnOkK/UBl0o7msG9k=", 
"ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOsOWQBYI+BXY81/x/jv3URNQxi5p8vLDeQfEZVgnjS4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_loadavg": {"1m": 0.4013671875, "5m": 0.33447265625, "15m": 0.16796875}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "58", "second": "02", "epoch": "1727096282", "epoch_int": "1727096282", "date": "2024-09-23", "time": "08:58:02", "iso8601_micro": "2024-09-23T12:58:02.275354Z", "iso8601": "2024-09-23T12:58:02Z", "iso8601_basic": "20240923T085802275354", "iso8601_basic_short": "20240923T085802", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_lsb": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_local": {}, "ansible_iscsi_iqn": "", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 18911 1727096282.29050: stdout chunk (state=3): >>># clear 
sys.path_importer_cache <<< 18911 1727096282.29072: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._<<< 18911 1727096282.29123: stdout chunk (state=3): >>> # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value <<< 18911 1727096282.29178: stdout chunk (state=3): >>># clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io<<< 18911 1727096282.29216: stdout chunk (state=3): >>> # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os<<< 18911 1727096282.29617: stdout chunk (state=3): >>> # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] 
removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] 
removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing 
ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast<<< 18911 1727096282.29646: stdout chunk (state=3): >>> # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux<<< 18911 1727096282.29649: stdout chunk (state=3): >>> # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing 
ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils<<< 18911 1727096282.29712: stdout chunk (state=3): >>> # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__<<< 18911 1727096282.29716: stdout chunk (state=3): >>> # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util<<< 18911 1727096282.29741: stdout chunk (state=3): >>> # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor<<< 18911 1727096282.29766: stdout chunk 
(state=3): >>> # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips<<< 18911 1727096282.29794: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version<<< 18911 1727096282.29811: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl<<< 18911 1727096282.29892: stdout 
chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # 
cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system<<< 18911 1727096282.29933: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys<<< 18911 1727096282.30285: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # 
destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] 
removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 18911 1727096282.30678: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 18911 1727096282.30783: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2<<< 18911 1727096282.30811: stdout chunk (state=3): >>> # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii<<< 18911 1727096282.30827: stdout chunk (state=3): >>> # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path<<< 18911 1727096282.30859: stdout chunk (state=3): >>> # destroy zipfile <<< 18911 1727096282.31201: stdout chunk (state=3): >>># destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess <<< 18911 1727096282.31221: stdout chunk (state=3): >>># destroy syslog # destroy uuid # destroy selinux # destroy shutil <<< 18911 1727096282.31245: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 18911 1727096282.31285: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector <<< 18911 1727096282.31310: stdout chunk (state=3): >>># destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle <<< 18911 
1727096282.31342: stdout chunk (state=3): >>># destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors <<< 18911 1727096282.31385: stdout chunk (state=3): >>># destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 <<< 18911 1727096282.31550: stdout chunk (state=3): >>># destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob <<< 18911 1727096282.31619: stdout chunk (state=3): >>># destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 18911 1727096282.31642: stdout chunk (state=3): >>># destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 18911 1727096282.31697: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping 
encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools <<< 18911 1727096282.31729: stdout chunk (state=3): >>># cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os <<< 18911 1727096282.31899: stdout chunk (state=3): >>># destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 18911 1727096282.32041: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 18911 1727096282.32087: stdout chunk (state=3): >>># destroy 
_collections # destroy platform <<< 18911 1727096282.32120: stdout chunk (state=3): >>># destroy _uuid # destroy stat # destroy genericpath <<< 18911 1727096282.32154: stdout chunk (state=3): >>># destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing <<< 18911 1727096282.32403: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 18911 1727096282.32406: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib <<< 18911 1727096282.32444: stdout chunk (state=3): >>># destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools <<< 18911 1727096282.32466: stdout chunk (state=3): >>># destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread <<< 18911 1727096282.32593: stdout chunk (state=3): >>># clear sys.audit hooks <<< 18911 1727096282.33230: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
<<< 18911 1727096282.33233: stdout chunk (state=3): >>><<< 18911 1727096282.33242: stderr chunk (state=3): >>><<< 18911 1727096282.33787: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91c184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91be7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91c1aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91a2d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91a2dfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91a6bdd0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91a6bfe0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object 
from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91aa37a0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91aa3e30> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91a83aa0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91a811c0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91a68f80> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91ac3710> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91ac2330> # 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91a82090> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91ac0b90> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91af8740> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91a68200> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a91af8bf0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91af8aa0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a91af8e90> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91a66d20> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91af9580> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91af9250> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91afa480> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91b10680> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a91b11d60> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91b12c00> # extension module '_bz2' 
loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a91b13260> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91b12150> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a91b13ce0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91b13410> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91afa4b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a9184bbc0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a918746b0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91874410> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a918746e0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a91875010> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a91875a00> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a918748c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91849d60> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches 
/usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91876d80> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91875880> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91afaba0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a9189f110> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a918c3440> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a919241a0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code 
object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91926900> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a919242c0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a918f1220> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a9172d250> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a918c2240> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91877ce0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f5a9172d4f0> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_poat8byj/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f5a9178efc0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a9176deb0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a9176d010> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a9178ce60> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a917c2900> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a917c2690> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a917c1fa0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from 
'/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a917c23f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a9178f9e0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a917c36b0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a917c38f0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a917c3e30> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a9112daf0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a9112f2c0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f5a91130140> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a911312e0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91133dd0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a91b12b70> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91132090> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7f5a9113bdd0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a9113a8a0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a9113a630> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a9113ab70> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a911325a0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a9117fa10> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91180140> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a91181be0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a911819a0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a91184140> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a911822d0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91187830> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91184230> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a911885f0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a91188830> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a91188950> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a911802f0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a910141d0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a91015370> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a9118a960> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a9118bd10> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a9118a5a0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a910195e0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a9101a390> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91015580> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc 
matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a9101a450> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a9101b590> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed 
from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a91026240> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a91020e00> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a9110ebd0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a911fe8a0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a910263c0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a910155e0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: 
zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a910ba2a0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import 
'_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a90c781d0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a90c78530> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a910a0170> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a910bade0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a910b8950> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a910b92b0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a90c7b440> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a90c7acf0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from 
'/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a90c7ae70> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a90c7a120> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a90c7b470> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a90cd9f70> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a90c7bf50> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a910b8d40> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a90cdb530> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a90cdab70> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: 
zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a90d16180> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a90d044d0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a90d29d30> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a90d29970> import 
'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a90b2b4d0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a90b29a60> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a90b2aa20> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a90b72ae0> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a90b710a0> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a90b72930> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a90b71f70> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_is_chroot": false, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 40710 10.31.11.125 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 40710 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", 
"DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-125", "ansible_nodename": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2082d1236a0674562ec5e13633e7ec", "ansible_processor": ["0", "GenuineIntel", "Intel(R) 
Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2957, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 574, "free": 2957}, "nocache": {"free": 3294, "used": 237}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_uuid": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 
GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 435, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261795434496, "block_size": 4096, "block_total": 65519099, "block_available": 63914901, "block_used": 1604198, "inode_total": 131070960, "inode_available": 131029099, "inode_used": 41861, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off 
[fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10ff:acff:fe3f:90f5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", 
"tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, 
"timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.125"], "ansible_all_ipv6_addresses": ["fe80::10ff:acff:fe3f:90f5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.125", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10ff:acff:fe3f:90f5"]}, "ansible_fibre_channel_wwn": [], "ansible_pkg_mgr": "dnf", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCSodd+GDkKMH+cjKateYxMnFqGIiE1cnsha6Cte8GTIWdeerR6yaWx/72lnuD/E1u9q+QiEGNjwKJ+EVlEbMpe1E41fS12cQMWtucXlJvMJH1Ykit9m7FF02YjLW9EkA9SSgc18cxhBxGHI+sOyXoiqbbp/q8CeDDI6WWQ+a3ZlRONcuaxwv9JBCJY6ZIqEDUsG55fAkwTft0LrM9VfN5D7iCcrlRTAVcjACJKgdvzHt52sVh5wuhCgzisIGrxldgqYAo0PnKgLwHzMWTYpwBJQIwMf6hC7O+PTnljDRXhsY8vTDdVI9W4Nr7mLWJ7W5JZ0lrKajT+FZ1b8usigp6firlnTtqzVmKJ4HJTlBOQ00HmCvVRSFb8Y15UB8Gz/xjJSXiQklJoKt+/wRRazuZALEowaIIdTzQUGv5Yx/BYhcAv8KVvhqbfkvyuwAgI+jzck3nhlUJ3B1WVYs5RdDY1WWEbqEtKyERi7P+cyWg6OhEMmJIpLs1QBW+CU/89w+s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE9Vvl7mU+dsvtGHlOyMt0Nx2BOUkSxor6Ul/pQgHJkP0t5ie5s9TW9uniTzz5NfayEXaMlnOkK/UBl0o7msG9k=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOsOWQBYI+BXY81/x/jv3URNQxi5p8vLDeQfEZVgnjS4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_loadavg": {"1m": 0.4013671875, "5m": 0.33447265625, "15m": 0.16796875}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_selinux_python_present": 
true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "58", "second": "02", "epoch": "1727096282", "epoch_int": "1727096282", "date": "2024-09-23", "time": "08:58:02", "iso8601_micro": "2024-09-23T12:58:02.275354Z", "iso8601": "2024-09-23T12:58:02Z", "iso8601_basic": "20240923T085802275354", "iso8601_basic_short": "20240923T085802", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_lsb": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_local": {}, "ansible_iscsi_iqn": "", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io 
# cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # 
cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # 
cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing 
ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing 
__mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing 
ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] 
removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # 
destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # 
destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # 
destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] 
wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy 
ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared 
connection to 10.31.11.125 closed. [WARNING]: Module invocation had junk after the JSON data:
cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing 
ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] 
removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] 
removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # 
destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy 
ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy 
json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] 
wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy 
systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host managed_node1 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible- core/2.17/reference_appendices/interpreter_discovery.html for more information. 
18911 1727096282.38792: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096280.961373-18934-118319348086136/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18911 1727096282.38796: _low_level_execute_command(): starting 18911 1727096282.38798: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096280.961373-18934-118319348086136/ > /dev/null 2>&1 && sleep 0' 18911 1727096282.40004: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096282.40078: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096282.40090: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096282.40284: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096282.42653: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096282.42657: stderr chunk (state=3): >>><<< 18911 1727096282.42660: stdout chunk (state=3): >>><<< 18911 1727096282.42873: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096282.42877: handler run complete 18911 1727096282.42880: variable 'ansible_facts' from source: unknown 18911 1727096282.42907: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096282.43226: variable 'ansible_facts' from source: unknown 18911 1727096282.43313: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096282.43436: attempt loop complete, returning result 18911 1727096282.43440: _execute() done 18911 1727096282.43443: dumping result to json 18911 1727096282.43676: done dumping result, returning 18911 1727096282.43685: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0afff68d-5257-09a7-aae1-00000000007c] 18911 1727096282.43688: sending task result for task 0afff68d-5257-09a7-aae1-00000000007c 18911 1727096282.44776: done sending task result for task 0afff68d-5257-09a7-aae1-00000000007c 18911 1727096282.44779: WORKER PROCESS EXITING ok: [managed_node1] 18911 1727096282.45349: no more pending results, returning what we have 18911 1727096282.45352: results queue empty 18911 1727096282.45353: checking for any_errors_fatal 18911 1727096282.45354: done checking for any_errors_fatal 18911 1727096282.45355: checking for max_fail_percentage 18911 1727096282.45356: done checking for max_fail_percentage 18911 1727096282.45357: checking to see if all hosts have failed and the running result is not ok 18911 1727096282.45357: done checking to see if all hosts have failed 18911 1727096282.45358: getting the remaining hosts for this loop 18911 1727096282.45359: done getting the remaining hosts for this loop 18911 1727096282.45363: getting the next task for host managed_node1 18911 1727096282.45370: done getting next task for host managed_node1 18911 1727096282.45371: ^ task is: TASK: meta (flush_handlers) 18911 1727096282.45373: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 18911 1727096282.45454: getting variables 18911 1727096282.45456: in VariableManager get_vars() 18911 1727096282.45480: Calling all_inventory to load vars for managed_node1 18911 1727096282.45483: Calling groups_inventory to load vars for managed_node1 18911 1727096282.45493: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096282.45502: Calling all_plugins_play to load vars for managed_node1 18911 1727096282.45505: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096282.45508: Calling groups_plugins_play to load vars for managed_node1 18911 1727096282.45942: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096282.46363: done with get_vars() 18911 1727096282.46376: done getting variables 18911 1727096282.46523: in VariableManager get_vars() 18911 1727096282.46532: Calling all_inventory to load vars for managed_node1 18911 1727096282.46535: Calling groups_inventory to load vars for managed_node1 18911 1727096282.46581: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096282.46587: Calling all_plugins_play to load vars for managed_node1 18911 1727096282.46590: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096282.46593: Calling groups_plugins_play to load vars for managed_node1 18911 1727096282.46921: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096282.47378: done with get_vars() 18911 1727096282.47393: done queuing things up, now waiting for results queue to drain 18911 1727096282.47395: results queue empty 18911 1727096282.47396: checking for any_errors_fatal 18911 1727096282.47399: done checking for any_errors_fatal 18911 1727096282.47400: checking for max_fail_percentage 18911 1727096282.47401: done checking for max_fail_percentage 18911 
1727096282.47401: checking to see if all hosts have failed and the running result is not ok 18911 1727096282.47402: done checking to see if all hosts have failed 18911 1727096282.47480: getting the remaining hosts for this loop 18911 1727096282.47482: done getting the remaining hosts for this loop 18911 1727096282.47485: getting the next task for host managed_node1 18911 1727096282.47491: done getting next task for host managed_node1 18911 1727096282.47494: ^ task is: TASK: Include the task 'el_repo_setup.yml' 18911 1727096282.47496: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18911 1727096282.47498: getting variables 18911 1727096282.47499: in VariableManager get_vars() 18911 1727096282.47508: Calling all_inventory to load vars for managed_node1 18911 1727096282.47510: Calling groups_inventory to load vars for managed_node1 18911 1727096282.47572: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096282.47578: Calling all_plugins_play to load vars for managed_node1 18911 1727096282.47581: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096282.47584: Calling groups_plugins_play to load vars for managed_node1 18911 1727096282.48304: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096282.48637: done with get_vars() 18911 1727096282.48647: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml:11 Monday 23 September 2024 08:58:02 -0400 (0:00:01.590) 0:00:01.602 ****** 18911 1727096282.48786: entering _queue_task() for managed_node1/include_tasks 
18911 1727096282.48788: Creating lock for include_tasks 18911 1727096282.49647: worker is 1 (out of 1 available) 18911 1727096282.49659: exiting _queue_task() for managed_node1/include_tasks 18911 1727096282.49672: done queuing things up, now waiting for results queue to drain 18911 1727096282.49675: waiting for pending results... 18911 1727096282.50480: running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' 18911 1727096282.50485: in run() - task 0afff68d-5257-09a7-aae1-000000000006 18911 1727096282.50488: variable 'ansible_search_path' from source: unknown 18911 1727096282.50491: calling self._execute() 18911 1727096282.50789: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096282.50794: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096282.50797: variable 'omit' from source: magic vars 18911 1727096282.51002: _execute() done 18911 1727096282.51006: dumping result to json 18911 1727096282.51009: done dumping result, returning 18911 1727096282.51012: done running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' [0afff68d-5257-09a7-aae1-000000000006] 18911 1727096282.51029: sending task result for task 0afff68d-5257-09a7-aae1-000000000006 18911 1727096282.51387: no more pending results, returning what we have 18911 1727096282.51393: in VariableManager get_vars() 18911 1727096282.51431: Calling all_inventory to load vars for managed_node1 18911 1727096282.51435: Calling groups_inventory to load vars for managed_node1 18911 1727096282.51439: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096282.51456: Calling all_plugins_play to load vars for managed_node1 18911 1727096282.51460: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096282.51463: Calling groups_plugins_play to load vars for managed_node1 18911 1727096282.51868: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096282.52450: done sending task result for task 0afff68d-5257-09a7-aae1-000000000006 18911 1727096282.52453: WORKER PROCESS EXITING 18911 1727096282.52540: done with get_vars() 18911 1727096282.52548: variable 'ansible_search_path' from source: unknown 18911 1727096282.52633: we have included files to process 18911 1727096282.52634: generating all_blocks data 18911 1727096282.52635: done generating all_blocks data 18911 1727096282.52637: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 18911 1727096282.52638: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 18911 1727096282.52641: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 18911 1727096282.54088: in VariableManager get_vars() 18911 1727096282.54108: done with get_vars() 18911 1727096282.54120: done processing included file 18911 1727096282.54122: iterating over new_blocks loaded from include file 18911 1727096282.54124: in VariableManager get_vars() 18911 1727096282.54138: done with get_vars() 18911 1727096282.54140: filtering new block on tags 18911 1727096282.54155: done filtering new block on tags 18911 1727096282.54159: in VariableManager get_vars() 18911 1727096282.54181: done with get_vars() 18911 1727096282.54183: filtering new block on tags 18911 1727096282.54199: done filtering new block on tags 18911 1727096282.54202: in VariableManager get_vars() 18911 1727096282.54213: done with get_vars() 18911 1727096282.54215: filtering new block on tags 18911 1727096282.54227: done filtering new block on tags 18911 1727096282.54229: done iterating over new_blocks loaded from include file included: 
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node1 18911 1727096282.54235: extending task lists for all hosts with included blocks 18911 1727096282.54291: done extending task lists 18911 1727096282.54292: done processing included files 18911 1727096282.54293: results queue empty 18911 1727096282.54294: checking for any_errors_fatal 18911 1727096282.54295: done checking for any_errors_fatal 18911 1727096282.54296: checking for max_fail_percentage 18911 1727096282.54297: done checking for max_fail_percentage 18911 1727096282.54298: checking to see if all hosts have failed and the running result is not ok 18911 1727096282.54298: done checking to see if all hosts have failed 18911 1727096282.54299: getting the remaining hosts for this loop 18911 1727096282.54300: done getting the remaining hosts for this loop 18911 1727096282.54302: getting the next task for host managed_node1 18911 1727096282.54306: done getting next task for host managed_node1 18911 1727096282.54308: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 18911 1727096282.54311: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096282.54313: getting variables 18911 1727096282.54314: in VariableManager get_vars() 18911 1727096282.54323: Calling all_inventory to load vars for managed_node1 18911 1727096282.54325: Calling groups_inventory to load vars for managed_node1 18911 1727096282.54327: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096282.54333: Calling all_plugins_play to load vars for managed_node1 18911 1727096282.54335: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096282.54338: Calling groups_plugins_play to load vars for managed_node1 18911 1727096282.54520: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096282.54718: done with get_vars() 18911 1727096282.54727: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Monday 23 September 2024 08:58:02 -0400 (0:00:00.060) 0:00:01.662 ****** 18911 1727096282.54800: entering _queue_task() for managed_node1/setup 18911 1727096282.55247: worker is 1 (out of 1 available) 18911 1727096282.55257: exiting _queue_task() for managed_node1/setup 18911 1727096282.55269: done queuing things up, now waiting for results queue to drain 18911 1727096282.55270: waiting for pending results... 
18911 1727096282.55424: running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test 18911 1727096282.55526: in run() - task 0afff68d-5257-09a7-aae1-00000000008d 18911 1727096282.55541: variable 'ansible_search_path' from source: unknown 18911 1727096282.55551: variable 'ansible_search_path' from source: unknown 18911 1727096282.55593: calling self._execute() 18911 1727096282.55669: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096282.55681: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096282.55695: variable 'omit' from source: magic vars 18911 1727096282.56952: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18911 1727096282.61725: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18911 1727096282.61860: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18911 1727096282.62178: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18911 1727096282.62182: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18911 1727096282.62184: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18911 1727096282.62187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096282.62408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096282.62438: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096282.62482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096282.62519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096282.62879: variable 'ansible_facts' from source: unknown 18911 1727096282.63113: variable 'network_test_required_facts' from source: task vars 18911 1727096282.63264: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): False 18911 1727096282.63270: when evaluation is False, skipping this task 18911 1727096282.63273: _execute() done 18911 1727096282.63275: dumping result to json 18911 1727096282.63278: done dumping result, returning 18911 1727096282.63280: done running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test [0afff68d-5257-09a7-aae1-00000000008d] 18911 1727096282.63282: sending task result for task 0afff68d-5257-09a7-aae1-00000000008d skipping: [managed_node1] => { "changed": false, "false_condition": "not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts", "skip_reason": "Conditional result was False" } 18911 1727096282.63593: no more pending results, returning what we have 18911 1727096282.63596: results queue empty 18911 1727096282.63597: checking for any_errors_fatal 18911 1727096282.63599: done checking for any_errors_fatal 18911 1727096282.63599: checking for max_fail_percentage 18911 1727096282.63601: done checking for 
max_fail_percentage 18911 1727096282.63602: checking to see if all hosts have failed and the running result is not ok 18911 1727096282.63603: done checking to see if all hosts have failed 18911 1727096282.63604: getting the remaining hosts for this loop 18911 1727096282.63605: done getting the remaining hosts for this loop 18911 1727096282.63609: getting the next task for host managed_node1 18911 1727096282.63619: done getting next task for host managed_node1 18911 1727096282.63623: ^ task is: TASK: Check if system is ostree 18911 1727096282.63626: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096282.63629: getting variables 18911 1727096282.63631: in VariableManager get_vars() 18911 1727096282.63660: Calling all_inventory to load vars for managed_node1 18911 1727096282.63663: Calling groups_inventory to load vars for managed_node1 18911 1727096282.63670: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096282.63682: Calling all_plugins_play to load vars for managed_node1 18911 1727096282.63686: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096282.63689: Calling groups_plugins_play to load vars for managed_node1 18911 1727096282.64117: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096282.64501: done with get_vars() 18911 1727096282.64511: done getting variables 18911 1727096282.64589: done sending task result for task 0afff68d-5257-09a7-aae1-00000000008d 18911 1727096282.64592: WORKER PROCESS EXITING TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Monday 23 September 2024 08:58:02 -0400 (0:00:00.099) 0:00:01.762 ****** 18911 1727096282.64775: entering _queue_task() for managed_node1/stat 18911 1727096282.65281: worker is 1 (out of 1 available) 18911 1727096282.65383: exiting _queue_task() for managed_node1/stat 18911 1727096282.65397: done queuing things up, now waiting for results queue to drain 18911 1727096282.65399: waiting for pending results... 
18911 1727096282.65711: running TaskExecutor() for managed_node1/TASK: Check if system is ostree 18911 1727096282.65916: in run() - task 0afff68d-5257-09a7-aae1-00000000008f 18911 1727096282.65934: variable 'ansible_search_path' from source: unknown 18911 1727096282.66014: variable 'ansible_search_path' from source: unknown 18911 1727096282.66054: calling self._execute() 18911 1727096282.66246: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096282.66259: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096282.66281: variable 'omit' from source: magic vars 18911 1727096282.67250: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18911 1727096282.67637: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18911 1727096282.67876: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18911 1727096282.67879: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18911 1727096282.68078: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18911 1727096282.68188: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18911 1727096282.68224: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18911 1727096282.68256: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096282.68291: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18911 1727096282.68575: Evaluated conditional (not __network_is_ostree is defined): True 18911 1727096282.68642: variable 'omit' from source: magic vars 18911 1727096282.68690: variable 'omit' from source: magic vars 18911 1727096282.68785: variable 'omit' from source: magic vars 18911 1727096282.68875: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18911 1727096282.68938: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18911 1727096282.68982: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18911 1727096282.69004: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096282.69017: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096282.69051: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18911 1727096282.69065: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096282.69077: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096282.69190: Set connection var ansible_shell_executable to /bin/sh 18911 1727096282.69201: Set connection var ansible_timeout to 10 18911 1727096282.69208: Set connection var ansible_shell_type to sh 18911 1727096282.69220: Set connection var ansible_module_compression to ZIP_DEFLATED 18911 1727096282.69289: Set connection var ansible_pipelining to False 18911 1727096282.69293: Set connection var ansible_connection to ssh 18911 1727096282.69295: variable 'ansible_shell_executable' from source: unknown 18911 1727096282.69297: variable 'ansible_connection' from 
source: unknown 18911 1727096282.69300: variable 'ansible_module_compression' from source: unknown 18911 1727096282.69301: variable 'ansible_shell_type' from source: unknown 18911 1727096282.69303: variable 'ansible_shell_executable' from source: unknown 18911 1727096282.69305: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096282.69307: variable 'ansible_pipelining' from source: unknown 18911 1727096282.69309: variable 'ansible_timeout' from source: unknown 18911 1727096282.69311: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096282.69460: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18911 1727096282.69481: variable 'omit' from source: magic vars 18911 1727096282.69491: starting attempt loop 18911 1727096282.69498: running the handler 18911 1727096282.69518: _low_level_execute_command(): starting 18911 1727096282.69530: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18911 1727096282.70284: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096282.70289: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096282.70386: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096282.70399: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096282.70422: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096282.70435: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096282.70538: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18911 1727096282.72509: stdout chunk (state=3): >>>/root <<< 18911 1727096282.72924: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096282.72930: stdout chunk (state=3): >>><<< 18911 1727096282.72933: stderr chunk (state=3): >>><<< 18911 1727096282.72938: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 18911 1727096282.72948: _low_level_execute_command(): starting 18911 1727096282.72951: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096282.7281702-18993-235029925633023 `" && echo ansible-tmp-1727096282.7281702-18993-235029925633023="` echo /root/.ansible/tmp/ansible-tmp-1727096282.7281702-18993-235029925633023 `" ) && sleep 0' 18911 1727096282.73590: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096282.73595: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096282.73646: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096282.73676: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096282.73718: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 18911 1727096282.73791: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18911 1727096282.76588: stdout chunk (state=3): >>>ansible-tmp-1727096282.7281702-18993-235029925633023=/root/.ansible/tmp/ansible-tmp-1727096282.7281702-18993-235029925633023 <<< 18911 1727096282.76622: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096282.76626: stdout chunk (state=3): >>><<< 18911 1727096282.76628: stderr chunk (state=3): >>><<< 18911 1727096282.76651: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096282.7281702-18993-235029925633023=/root/.ansible/tmp/ansible-tmp-1727096282.7281702-18993-235029925633023 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 18911 1727096282.76784: variable 'ansible_module_compression' from source: 
unknown 18911 1727096282.76797: ANSIBALLZ: Using lock for stat 18911 1727096282.76805: ANSIBALLZ: Acquiring lock 18911 1727096282.76813: ANSIBALLZ: Lock acquired: 140481135533840 18911 1727096282.76820: ANSIBALLZ: Creating module 18911 1727096282.90171: ANSIBALLZ: Writing module into payload 18911 1727096282.90273: ANSIBALLZ: Writing module 18911 1727096282.90295: ANSIBALLZ: Renaming module 18911 1727096282.90372: ANSIBALLZ: Done creating module 18911 1727096282.90375: variable 'ansible_facts' from source: unknown 18911 1727096282.90401: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096282.7281702-18993-235029925633023/AnsiballZ_stat.py 18911 1727096282.90594: Sending initial data 18911 1727096282.90597: Sent initial data (153 bytes) 18911 1727096282.91198: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096282.91207: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096282.91235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096282.91365: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096282.91370: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/8e98a30b23' <<< 18911 1727096282.91373: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096282.91376: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096282.91468: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18911 1727096282.93916: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18911 1727096282.94036: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18911 1727096282.94126: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18911d7od04qi/tmp21my1_yz /root/.ansible/tmp/ansible-tmp-1727096282.7281702-18993-235029925633023/AnsiballZ_stat.py <<< 18911 1727096282.94139: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096282.7281702-18993-235029925633023/AnsiballZ_stat.py" <<< 18911 1727096282.94214: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18911d7od04qi/tmp21my1_yz" to remote "/root/.ansible/tmp/ansible-tmp-1727096282.7281702-18993-235029925633023/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096282.7281702-18993-235029925633023/AnsiballZ_stat.py" <<< 18911 1727096282.95402: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096282.95406: stdout chunk (state=3): >>><<< 18911 1727096282.95410: stderr chunk (state=3): >>><<< 18911 1727096282.95413: done transferring module to remote 18911 1727096282.95415: _low_level_execute_command(): starting 18911 1727096282.95421: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096282.7281702-18993-235029925633023/ /root/.ansible/tmp/ansible-tmp-1727096282.7281702-18993-235029925633023/AnsiballZ_stat.py && sleep 0' 18911 1727096282.96380: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096282.96384: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096282.96386: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096282.96388: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096282.96505: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096282.99113: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096282.99165: stderr chunk (state=3): >>><<< 18911 1727096282.99171: stdout chunk (state=3): >>><<< 18911 1727096282.99203: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096282.99206: _low_level_execute_command(): starting 18911 1727096282.99209: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096282.7281702-18993-235029925633023/AnsiballZ_stat.py && sleep 0' 18911 1727096282.99794: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096282.99800: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096282.99810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096282.99824: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096282.99924: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 18911 1727096282.99984: stderr chunk (state=3): >>>debug2: match not found <<< 18911 1727096282.99988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096283.00030: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/8e98a30b23' <<< 18911 1727096283.00042: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096283.00186: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096283.00346: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096283.02657: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 18911 1727096283.02671: stdout chunk (state=3): >>>import _imp # builtin <<< 18911 1727096283.02798: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # <<< 18911 1727096283.02836: stdout chunk (state=3): >>>import 'posix' # <<< 18911 1727096283.02881: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook <<< 18911 1727096283.02934: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 18911 1727096283.02952: stdout chunk (state=3): >>>import '_codecs' # <<< 18911 1727096283.03229: stdout chunk (state=3): >>>import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1c184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1be7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f11e1c1aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # <<< 18911 1727096283.03323: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 18911 1727096283.03349: stdout chunk (state=3): >>>import '_collections_abc' # <<< 18911 1727096283.03352: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 18911 1727096283.03386: stdout chunk (state=3): >>>import 'os' # <<< 18911 1727096283.03405: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages <<< 18911 1727096283.03439: stdout chunk (state=3): >>>Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<< 18911 1727096283.03472: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 18911 1727096283.03526: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1a2d130> <<< 18911 1727096283.03582: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 18911 1727096283.03604: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1a2dfa0> <<< 18911 1727096283.03660: stdout chunk (state=3): >>>import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more 
information. <<< 18911 1727096283.04017: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 18911 1727096283.04069: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 18911 1727096283.04099: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 18911 1727096283.04192: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 18911 1727096283.04246: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1a6be90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 18911 1727096283.04258: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 18911 1727096283.04303: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1a6bf50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 18911 1727096283.04327: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 18911 1727096283.04423: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from 
'/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 18911 1727096283.04490: stdout chunk (state=3): >>>import 'itertools' # <<< 18911 1727096283.04494: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1aa3830> <<< 18911 1727096283.04554: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1aa3ec0> <<< 18911 1727096283.04603: stdout chunk (state=3): >>>import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1a83b60> <<< 18911 1727096283.04606: stdout chunk (state=3): >>>import '_functools' # <<< 18911 1727096283.04875: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1a81280> <<< 18911 1727096283.04908: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1a69040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 18911 1727096283.04955: stdout chunk (state=3): >>>import 
're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1ac37d0> <<< 18911 1727096283.04972: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1ac23f0> <<< 18911 1727096283.05007: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1a82150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1ac0c20> <<< 18911 1727096283.05178: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1af8860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1a682c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11e1af8d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1af8bc0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 18911 1727096283.05248: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f11e1af8f80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1a66de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py <<< 18911 1727096283.05333: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1af9610> <<< 18911 1727096283.05374: stdout chunk (state=3): >>>import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1af92e0> import 'importlib.machinery' # <<< 18911 1727096283.05413: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1afa510> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 18911 1727096283.05459: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 18911 1727096283.05600: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1b10710> import 'errno' # <<< 18911 1727096283.05603: stdout chunk (state=3): >>># extension module 'zlib' loaded from 
'/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11e1b11df0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 18911 1727096283.05615: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 18911 1727096283.05774: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1b12c90> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 18911 1727096283.05781: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11e1b132f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1b121e0> <<< 18911 1727096283.05813: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11e1b13d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1b134a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f11e1afa540> <<< 18911 1727096283.05825: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 18911 1727096283.05860: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 18911 1727096283.05894: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 18911 1727096283.05995: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 18911 1727096283.06027: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11e1897bf0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11e18c06e0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e18c0440> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11e18c0710> <<< 18911 1727096283.06134: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 18911 1727096283.06147: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 18911 1727096283.06326: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11e18c1040> <<< 18911 1727096283.06547: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11e18c19a0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e18c08f0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1895d90> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 18911 1727096283.06577: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 18911 1727096283.06693: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e18c2db0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e18c1af0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1afac30> # 
/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 18911 1727096283.06793: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 18911 1727096283.06804: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 18911 1727096283.06824: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 18911 1727096283.06972: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e18ef110> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 18911 1727096283.07009: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e190f4a0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 18911 1727096283.07048: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 18911 1727096283.07143: stdout chunk (state=3): >>>import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1970260> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 18911 
1727096283.07172: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 18911 1727096283.07194: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 18911 1727096283.07242: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 18911 1727096283.07353: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e19729c0> <<< 18911 1727096283.07610: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1970380> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1939280> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1775340> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e190e2a0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e18c3ce0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 18911 1727096283.07613: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f11e17755b0> <<< 18911 1727096283.07776: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_vlmjb8lb/ansible_stat_payload.zip' # zipimport: zlib available <<< 18911 1727096283.07911: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096283.07946: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from 
'/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 18911 1727096283.07988: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 18911 1727096283.08071: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 18911 1727096283.08150: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e17cb0b0> import '_typing' # <<< 18911 1727096283.08291: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e17a9fa0> <<< 18911 1727096283.08371: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e17a9130> # zipimport: zlib available import 'ansible' # <<< 18911 1727096283.08403: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 18911 1727096283.10554: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096283.12287: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e17c8f80> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code 
object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11e17f2990> <<< 18911 1727096283.12385: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e17f2720> <<< 18911 1727096283.12388: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e17f2030> <<< 18911 1727096283.12798: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e17f2a80> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e17cbd40> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11e17f36e0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11e17f3920> # 
/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e17f3e30> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1115a60> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11e1117710> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 18911 1727096283.12843: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1118110> <<< 18911 1727096283.12846: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 18911 1727096283.12898: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 18911 1727096283.12904: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e11192b0> <<< 18911 1727096283.12918: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 18911 1727096283.12958: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 18911 1727096283.12977: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 18911 1727096283.13384: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e111bd10> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11e19701d0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1119ee0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1123a10> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1122510> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1122270> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from 
'/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 18911 1727096283.13432: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e11227b0> <<< 18911 1727096283.13470: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e111a450> <<< 18911 1727096283.13488: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11e116bcb0> <<< 18911 1727096283.13510: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e116bce0> <<< 18911 1727096283.13892: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11e116d850> import 'datetime' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f11e116d610> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11e116fd70> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e116df10> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 18911 1727096283.13917: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 18911 1727096283.13920: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 18911 1727096283.13963: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1173590> <<< 18911 1727096283.14383: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e116ff20> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11e1174830> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' 
executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11e1174380> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11e1174950> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e116b6e0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11e1177fe0> <<< 18911 1727096283.14523: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 18911 1727096283.14526: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11e1001640> <<< 18911 1727096283.14587: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e11767e0> # extension 
module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11e1177b90> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1176420> # zipimport: zlib available <<< 18911 1727096283.14659: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 18911 1727096283.14697: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096283.14790: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096283.14821: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available <<< 18911 1727096283.14871: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 18911 1727096283.14976: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096283.15085: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096283.15645: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096283.16205: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 18911 1727096283.16234: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 18911 1727096283.16299: stdout chunk (state=3): >>># extension module '_ctypes' loaded from 
'/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11e1005880> <<< 18911 1727096283.16376: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 18911 1727096283.16403: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1006690> <<< 18911 1727096283.16498: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1001490> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # <<< 18911 1727096283.16523: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096283.16653: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096283.16812: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 18911 1727096283.16839: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e10066f0> # zipimport: zlib available <<< 18911 1727096283.17288: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096283.17732: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096283.17799: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096283.17877: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 18911 1727096283.17949: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib 
available <<< 18911 1727096283.17979: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 18911 1727096283.18133: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096283.18156: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 18911 1727096283.18197: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096283.18233: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 18911 1727096283.18254: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096283.18480: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096283.18702: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 18911 1727096283.18775: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 18911 1727096283.18791: stdout chunk (state=3): >>>import '_ast' # <<< 18911 1727096283.18852: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1007950> <<< 18911 1727096283.18866: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096283.18924: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096283.19063: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 18911 1727096283.19080: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available <<< 18911 1727096283.19158: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # # zipimport: zlib available <<< 18911 1727096283.19237: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 18911 1727096283.19254: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096283.19277: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096283.19347: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 18911 1727096283.19378: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 18911 1727096283.19565: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11e1012360> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e100edb0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 18911 1727096283.19611: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096283.19680: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096283.19697: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096283.19750: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 18911 1727096283.19899: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 18911 1727096283.19902: stdout chunk (state=3): 
>>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 18911 1727096283.19915: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 18911 1727096283.19973: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1102d20> <<< 18911 1727096283.20010: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e182e9f0> <<< 18911 1727096283.20259: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1012570> <<< 18911 1727096283.20265: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1177320> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available <<< 18911 1727096283.20280: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 18911 1727096283.20405: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096283.20597: stdout chunk (state=3): >>># zipimport: zlib available <<< 18911 1727096283.20755: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 18911 
1727096283.20774: stdout chunk (state=3): >>># destroy __main__ <<< 18911 1727096283.21128: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs <<< 18911 1727096283.21298: stdout chunk (state=3): >>># cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # 
cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile <<< 18911 1727096283.21386: stdout chunk (state=3): >>># cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # 
destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform <<< 18911 1727096283.21394: stdout chunk (state=3): >>># cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text <<< 18911 1727096283.21534: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # 
cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy 
ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro<<< 18911 1727096283.21538: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 18911 1727096283.21776: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 18911 1727096283.22014: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath <<< 18911 1727096283.22020: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner <<< 18911 1727096283.22059: stdout chunk (state=3): >>># destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] 
wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize <<< 18911 1727096283.22136: stdout chunk (state=3): >>># cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external <<< 18911 1727096283.22141: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser <<< 18911 1727096283.22223: stdout chunk (state=3): >>># cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] 
wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 18911 1727096283.22438: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket # destroy _collections <<< 18911 1727096283.22442: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 18911 1727096283.22473: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 18911 1727096283.22510: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 18911 1727096283.22529: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 18911 1727096283.22632: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy 
collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect <<< 18911 1727096283.22665: stdout chunk (state=3): >>># destroy time # destroy _random # destroy _weakref <<< 18911 1727096283.22722: stdout chunk (state=3): >>># destroy _hashlib # destroy _operator # destroy _string # destroy re <<< 18911 1727096283.22739: stdout chunk (state=3): >>># destroy itertools # destroy _abc <<< 18911 1727096283.22792: stdout chunk (state=3): >>># destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 18911 1727096283.23162: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. <<< 18911 1727096283.23166: stdout chunk (state=3): >>><<< 18911 1727096283.23321: stderr chunk (state=3): >>><<< 18911 1727096283.23335: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1c184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1be7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object 
from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1c1aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1a2d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1a2dfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1a6be90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1a6bf50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1aa3830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f11e1aa3ec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1a83b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1a81280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1a69040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1ac37d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1ac23f0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1a82150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1ac0c20> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1af8860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1a682c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches 
/usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11e1af8d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1af8bc0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11e1af8f80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1a66de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1af9610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1af92e0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1afa510> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches 
/usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1b10710> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11e1b11df0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1b12c90> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11e1b132f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1b121e0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f11e1b13d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1b134a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1afa540> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11e1897bf0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11e18c06e0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e18c0440> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11e18c0710> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from 
'/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11e18c1040> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11e18c19a0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e18c08f0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1895d90> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e18c2db0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e18c1af0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1afac30> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f11e18ef110> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e190f4a0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1970260> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e19729c0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1970380> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1939280> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f11e1775340> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e190e2a0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e18c3ce0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f11e17755b0> # zipimport: found 30 names in '/tmp/ansible_stat_payload_vlmjb8lb/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e17cb0b0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e17a9fa0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e17a9130> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e17c8f80> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc 
matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11e17f2990> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e17f2720> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e17f2030> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e17f2a80> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e17cbd40> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11e17f36e0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f11e17f3920> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e17f3e30> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1115a60> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11e1117710> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1118110> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e11192b0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e111bd10> # extension module 
'_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11e19701d0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1119ee0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1123a10> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1122510> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1122270> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e11227b0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e111a450> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11e116bcb0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e116bce0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11e116d850> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e116d610> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11e116fd70> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e116df10> # 
/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1173590> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e116ff20> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11e1174830> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11e1174380> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11e1174950> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e116b6e0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11e1177fe0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11e1001640> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e11767e0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11e1177b90> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1176420> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' 
# import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11e1005880> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1006690> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1001490> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e10066f0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1007950> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11e1012360> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e100edb0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from 
'/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1102d20> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e182e9f0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1012570> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11e1177320> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear 
sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing 
importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform 
# cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy 
ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy 
importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # 
cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy 
ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: 
fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. [WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] 
removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external [interpreter teardown trace identical to the preceding module invocation's teardown, elided] # clear sys.audit hooks 18911 1727096283.24891: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'],
'_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096282.7281702-18993-235029925633023/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18911 1727096283.24894: _low_level_execute_command(): starting 18911 1727096283.24897: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096282.7281702-18993-235029925633023/ > /dev/null 2>&1 && sleep 0' 18911 1727096283.25143: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096283.25249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096283.25252: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 18911 1727096283.25255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096283.25257: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found <<< 18911 1727096283.25259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096283.25373: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 
1727096283.25455: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096283.25532: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096283.27616: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096283.27621: stderr chunk (state=3): >>><<< 18911 1727096283.27623: stdout chunk (state=3): >>><<< 18911 1727096283.27646: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096283.27677: handler run complete 18911 1727096283.27681: attempt loop complete, returning result 18911 1727096283.27683: _execute() done 18911 1727096283.27685: dumping result to json 18911 1727096283.27687: done dumping result, returning 18911 1727096283.27695: done running 
TaskExecutor() for managed_node1/TASK: Check if system is ostree [0afff68d-5257-09a7-aae1-00000000008f] 18911 1727096283.27850: sending task result for task 0afff68d-5257-09a7-aae1-00000000008f 18911 1727096283.28030: done sending task result for task 0afff68d-5257-09a7-aae1-00000000008f 18911 1727096283.28033: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 18911 1727096283.28122: no more pending results, returning what we have 18911 1727096283.28124: results queue empty 18911 1727096283.28125: checking for any_errors_fatal 18911 1727096283.28130: done checking for any_errors_fatal 18911 1727096283.28131: checking for max_fail_percentage 18911 1727096283.28133: done checking for max_fail_percentage 18911 1727096283.28133: checking to see if all hosts have failed and the running result is not ok 18911 1727096283.28134: done checking to see if all hosts have failed 18911 1727096283.28135: getting the remaining hosts for this loop 18911 1727096283.28136: done getting the remaining hosts for this loop 18911 1727096283.28139: getting the next task for host managed_node1 18911 1727096283.28145: done getting next task for host managed_node1 18911 1727096283.28147: ^ task is: TASK: Set flag to indicate system is ostree 18911 1727096283.28150: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096283.28153: getting variables 18911 1727096283.28154: in VariableManager get_vars() 18911 1727096283.28188: Calling all_inventory to load vars for managed_node1 18911 1727096283.28191: Calling groups_inventory to load vars for managed_node1 18911 1727096283.28194: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096283.28205: Calling all_plugins_play to load vars for managed_node1 18911 1727096283.28208: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096283.28211: Calling groups_plugins_play to load vars for managed_node1 18911 1727096283.28622: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096283.28940: done with get_vars() 18911 1727096283.28951: done getting variables 18911 1727096283.29128: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Monday 23 September 2024 08:58:03 -0400 (0:00:00.643) 0:00:02.405 ****** 18911 1727096283.29157: entering _queue_task() for managed_node1/set_fact 18911 1727096283.29159: Creating lock for set_fact 18911 1727096283.29466: worker is 1 (out of 1 available) 18911 1727096283.29481: exiting _queue_task() for managed_node1/set_fact 18911 1727096283.29493: done queuing things up, now waiting for results queue to drain 18911 1727096283.29495: waiting for pending results... 
18911 1727096283.29966: running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree 18911 1727096283.30172: in run() - task 0afff68d-5257-09a7-aae1-000000000090 18911 1727096283.30184: variable 'ansible_search_path' from source: unknown 18911 1727096283.30187: variable 'ansible_search_path' from source: unknown 18911 1727096283.30273: calling self._execute() 18911 1727096283.30415: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096283.30419: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096283.30433: variable 'omit' from source: magic vars 18911 1727096283.31492: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18911 1727096283.32113: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18911 1727096283.32229: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18911 1727096283.32470: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18911 1727096283.32474: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18911 1727096283.32578: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18911 1727096283.32614: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18911 1727096283.32674: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096283.32783: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18911 1727096283.33024: Evaluated conditional (not __network_is_ostree is defined): True 18911 1727096283.33035: variable 'omit' from source: magic vars 18911 1727096283.33378: variable 'omit' from source: magic vars 18911 1727096283.33409: variable '__ostree_booted_stat' from source: set_fact 18911 1727096283.33464: variable 'omit' from source: magic vars 18911 1727096283.33622: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18911 1727096283.33656: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18911 1727096283.33680: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18911 1727096283.33812: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096283.33816: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096283.33834: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18911 1727096283.33848: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096283.33858: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096283.34089: Set connection var ansible_shell_executable to /bin/sh 18911 1727096283.34169: Set connection var ansible_timeout to 10 18911 1727096283.34172: Set connection var ansible_shell_type to sh 18911 1727096283.34175: Set connection var ansible_module_compression to ZIP_DEFLATED 18911 1727096283.34177: Set connection var ansible_pipelining to False 18911 1727096283.34188: Set connection var ansible_connection to ssh 18911 1727096283.34274: variable 'ansible_shell_executable' 
from source: unknown 18911 1727096283.34277: variable 'ansible_connection' from source: unknown 18911 1727096283.34280: variable 'ansible_module_compression' from source: unknown 18911 1727096283.34282: variable 'ansible_shell_type' from source: unknown 18911 1727096283.34284: variable 'ansible_shell_executable' from source: unknown 18911 1727096283.34286: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096283.34288: variable 'ansible_pipelining' from source: unknown 18911 1727096283.34290: variable 'ansible_timeout' from source: unknown 18911 1727096283.34292: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096283.34579: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18911 1727096283.34604: variable 'omit' from source: magic vars 18911 1727096283.34615: starting attempt loop 18911 1727096283.34622: running the handler 18911 1727096283.34638: handler run complete 18911 1727096283.34651: attempt loop complete, returning result 18911 1727096283.34658: _execute() done 18911 1727096283.34666: dumping result to json 18911 1727096283.34696: done dumping result, returning 18911 1727096283.34902: done running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree [0afff68d-5257-09a7-aae1-000000000090] 18911 1727096283.34906: sending task result for task 0afff68d-5257-09a7-aae1-000000000090 18911 1727096283.34972: done sending task result for task 0afff68d-5257-09a7-aae1-000000000090 18911 1727096283.34975: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 18911 1727096283.35028: no more pending results, returning what we have 18911 1727096283.35032: results 
queue empty 18911 1727096283.35033: checking for any_errors_fatal 18911 1727096283.35039: done checking for any_errors_fatal 18911 1727096283.35039: checking for max_fail_percentage 18911 1727096283.35041: done checking for max_fail_percentage 18911 1727096283.35042: checking to see if all hosts have failed and the running result is not ok 18911 1727096283.35043: done checking to see if all hosts have failed 18911 1727096283.35043: getting the remaining hosts for this loop 18911 1727096283.35045: done getting the remaining hosts for this loop 18911 1727096283.35048: getting the next task for host managed_node1 18911 1727096283.35057: done getting next task for host managed_node1 18911 1727096283.35060: ^ task is: TASK: Fix CentOS6 Base repo 18911 1727096283.35062: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096283.35066: getting variables 18911 1727096283.35070: in VariableManager get_vars() 18911 1727096283.35103: Calling all_inventory to load vars for managed_node1 18911 1727096283.35106: Calling groups_inventory to load vars for managed_node1 18911 1727096283.35110: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096283.35122: Calling all_plugins_play to load vars for managed_node1 18911 1727096283.35126: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096283.35135: Calling groups_plugins_play to load vars for managed_node1 18911 1727096283.35912: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096283.36392: done with get_vars() 18911 1727096283.36403: done getting variables 18911 1727096283.36619: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Monday 23 September 2024 08:58:03 -0400 (0:00:00.074) 0:00:02.480 ****** 18911 1727096283.36647: entering _queue_task() for managed_node1/copy 18911 1727096283.37674: worker is 1 (out of 1 available) 18911 1727096283.37683: exiting _queue_task() for managed_node1/copy 18911 1727096283.37692: done queuing things up, now waiting for results queue to drain 18911 1727096283.37693: waiting for pending results... 
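The copy task queued here evaluates `ansible_distribution == 'CentOS'` (True) and then `ansible_distribution_major_version == '6'` (False) before skipping. The module arguments themselves are not visible in the trace, so the following is only a plausible shape for el_repo_setup.yml:26, with the destination path an assumption:

```yaml
# Sketch inferred from the evaluated conditionals; the copy's destination
# and repo-file content do not appear in this trace and are assumed.
- name: Fix CentOS6 Base repo
  copy:
    dest: /etc/yum.repos.d/CentOS-Base.repo  # assumed path
    content: ...  # not visible in the log
  when:
    - ansible_distribution == 'CentOS'
    - ansible_distribution_major_version == '6'
```

On this CentOS 10 host the second condition is False, which is why the trace reports "Conditional result was False" and skips.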
18911 1727096283.38056: running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo 18911 1727096283.38061: in run() - task 0afff68d-5257-09a7-aae1-000000000092 18911 1727096283.38063: variable 'ansible_search_path' from source: unknown 18911 1727096283.38065: variable 'ansible_search_path' from source: unknown 18911 1727096283.38187: calling self._execute() 18911 1727096283.38322: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096283.38372: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096283.38581: variable 'omit' from source: magic vars 18911 1727096283.39511: variable 'ansible_distribution' from source: facts 18911 1727096283.39590: Evaluated conditional (ansible_distribution == 'CentOS'): True 18911 1727096283.39863: variable 'ansible_distribution_major_version' from source: facts 18911 1727096283.39949: Evaluated conditional (ansible_distribution_major_version == '6'): False 18911 1727096283.39958: when evaluation is False, skipping this task 18911 1727096283.39979: _execute() done 18911 1727096283.40001: dumping result to json 18911 1727096283.40148: done dumping result, returning 18911 1727096283.40152: done running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo [0afff68d-5257-09a7-aae1-000000000092] 18911 1727096283.40154: sending task result for task 0afff68d-5257-09a7-aae1-000000000092 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 18911 1727096283.40420: no more pending results, returning what we have 18911 1727096283.40512: results queue empty 18911 1727096283.40514: checking for any_errors_fatal 18911 1727096283.40519: done checking for any_errors_fatal 18911 1727096283.40519: checking for max_fail_percentage 18911 1727096283.40521: done checking for max_fail_percentage 18911 1727096283.40522: checking to see if all hosts have failed and the 
running result is not ok 18911 1727096283.40523: done checking to see if all hosts have failed 18911 1727096283.40524: getting the remaining hosts for this loop 18911 1727096283.40525: done getting the remaining hosts for this loop 18911 1727096283.40528: getting the next task for host managed_node1 18911 1727096283.40651: done getting next task for host managed_node1 18911 1727096283.40654: ^ task is: TASK: Include the task 'enable_epel.yml' 18911 1727096283.40657: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096283.40662: getting variables 18911 1727096283.40664: in VariableManager get_vars() 18911 1727096283.40700: Calling all_inventory to load vars for managed_node1 18911 1727096283.40703: Calling groups_inventory to load vars for managed_node1 18911 1727096283.40707: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096283.40720: Calling all_plugins_play to load vars for managed_node1 18911 1727096283.40723: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096283.40726: Calling groups_plugins_play to load vars for managed_node1 18911 1727096283.41543: done sending task result for task 0afff68d-5257-09a7-aae1-000000000092 18911 1727096283.41546: WORKER PROCESS EXITING 18911 1727096283.41711: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096283.42554: done with get_vars() 18911 1727096283.42566: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Monday 23 September 2024 08:58:03 -0400 (0:00:00.063) 0:00:02.544 ****** 18911 1727096283.42971: entering _queue_task() for managed_node1/include_tasks 18911 1727096283.44365: worker is 1 (out of 1 available) 18911 1727096283.44380: exiting _queue_task() for managed_node1/include_tasks 18911 1727096283.44393: done queuing things up, now waiting for results queue to drain 18911 1727096283.44394: waiting for pending results... 
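The include queued here evaluates `(not __network_is_ostree | d(false))` and then loads tasks/enable_epel.yml from the same directory as el_repo_setup.yml, so the task at el_repo_setup.yml:51 can be sketched as:

```yaml
# Sketch inferred from the include path and conditional shown in the trace.
- name: Include the task 'enable_epel.yml'
  include_tasks: enable_epel.yml  # resolved relative to el_repo_setup.yml
  when: not __network_is_ostree | d(false)
```

Because `__network_is_ostree` was set to `false` earlier, the condition is True and the included blocks are appended to the task list, as the "extending task lists for all hosts with included blocks" entries below confirm.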
18911 1727096283.44704: running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' 18911 1727096283.44830: in run() - task 0afff68d-5257-09a7-aae1-000000000093 18911 1727096283.45074: variable 'ansible_search_path' from source: unknown 18911 1727096283.45079: variable 'ansible_search_path' from source: unknown 18911 1727096283.45083: calling self._execute() 18911 1727096283.45274: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096283.45278: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096283.45281: variable 'omit' from source: magic vars 18911 1727096283.46576: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18911 1727096283.51172: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18911 1727096283.51294: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18911 1727096283.51474: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18911 1727096283.51653: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18911 1727096283.51656: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18911 1727096283.51703: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096283.51745: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096283.51784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096283.51836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096283.51858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096283.51998: variable '__network_is_ostree' from source: set_fact 18911 1727096283.52022: Evaluated conditional (not __network_is_ostree | d(false)): True 18911 1727096283.52052: _execute() done 18911 1727096283.52070: dumping result to json 18911 1727096283.52079: done dumping result, returning 18911 1727096283.52091: done running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' [0afff68d-5257-09a7-aae1-000000000093] 18911 1727096283.52099: sending task result for task 0afff68d-5257-09a7-aae1-000000000093 18911 1727096283.52499: no more pending results, returning what we have 18911 1727096283.52504: in VariableManager get_vars() 18911 1727096283.52541: Calling all_inventory to load vars for managed_node1 18911 1727096283.52544: Calling groups_inventory to load vars for managed_node1 18911 1727096283.52548: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096283.52564: Calling all_plugins_play to load vars for managed_node1 18911 1727096283.52570: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096283.52574: Calling groups_plugins_play to load vars for managed_node1 18911 1727096283.53416: done sending task result for task 0afff68d-5257-09a7-aae1-000000000093 18911 1727096283.53420: WORKER PROCESS EXITING 18911 1727096283.53433: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved 
name 18911 1727096283.53636: done with get_vars() 18911 1727096283.53644: variable 'ansible_search_path' from source: unknown 18911 1727096283.53645: variable 'ansible_search_path' from source: unknown 18911 1727096283.53687: we have included files to process 18911 1727096283.53688: generating all_blocks data 18911 1727096283.53689: done generating all_blocks data 18911 1727096283.53694: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 18911 1727096283.53695: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 18911 1727096283.53698: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 18911 1727096283.54429: done processing included file 18911 1727096283.54432: iterating over new_blocks loaded from include file 18911 1727096283.54434: in VariableManager get_vars() 18911 1727096283.54447: done with get_vars() 18911 1727096283.54448: filtering new block on tags 18911 1727096283.54476: done filtering new block on tags 18911 1727096283.54479: in VariableManager get_vars() 18911 1727096283.54491: done with get_vars() 18911 1727096283.54492: filtering new block on tags 18911 1727096283.54508: done filtering new block on tags 18911 1727096283.54510: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node1 18911 1727096283.54516: extending task lists for all hosts with included blocks 18911 1727096283.54621: done extending task lists 18911 1727096283.54623: done processing included files 18911 1727096283.54624: results queue empty 18911 1727096283.54624: checking for any_errors_fatal 18911 1727096283.54627: done checking for any_errors_fatal 18911 1727096283.54628: checking for max_fail_percentage 18911 1727096283.54629: done 
checking for max_fail_percentage 18911 1727096283.54629: checking to see if all hosts have failed and the running result is not ok 18911 1727096283.54630: done checking to see if all hosts have failed 18911 1727096283.54631: getting the remaining hosts for this loop 18911 1727096283.54632: done getting the remaining hosts for this loop 18911 1727096283.54634: getting the next task for host managed_node1 18911 1727096283.54639: done getting next task for host managed_node1 18911 1727096283.54641: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 18911 1727096283.54643: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096283.54645: getting variables 18911 1727096283.54646: in VariableManager get_vars() 18911 1727096283.54654: Calling all_inventory to load vars for managed_node1 18911 1727096283.54656: Calling groups_inventory to load vars for managed_node1 18911 1727096283.54659: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096283.54666: Calling all_plugins_play to load vars for managed_node1 18911 1727096283.54676: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096283.54679: Calling groups_plugins_play to load vars for managed_node1 18911 1727096283.54850: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096283.55053: done with get_vars() 18911 1727096283.55064: done getting variables 18911 1727096283.55130: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 18911 1727096283.55332: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 10] ********************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Monday 23 September 2024 08:58:03 -0400 (0:00:00.124) 0:00:02.668 ****** 18911 1727096283.55385: entering _queue_task() for managed_node1/command 18911 1727096283.55387: Creating lock for command 18911 1727096283.55816: worker is 1 (out of 1 available) 18911 1727096283.55827: exiting _queue_task() for managed_node1/command 18911 1727096283.55838: done queuing things up, now waiting for results queue to drain 18911 1727096283.55840: waiting for pending results... 
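The templated task name `Create EPEL {{ ansible_distribution_major_version }}` renders as "Create EPEL 10", and the trace evaluates two membership guards before skipping. The command line itself never appears in the log, so only the guard can be reconstructed:

```yaml
# Guard reconstructed from the conditionals evaluated at enable_epel.yml:8;
# the actual command is not visible in this trace.
- name: Create EPEL {{ ansible_distribution_major_version }}
  command: "..."  # not shown in the log
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']
```

The major version here is 10, so the second condition fails and the task is skipped.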
18911 1727096283.56030: running TaskExecutor() for managed_node1/TASK: Create EPEL 10 18911 1727096283.56128: in run() - task 0afff68d-5257-09a7-aae1-0000000000ad 18911 1727096283.56174: variable 'ansible_search_path' from source: unknown 18911 1727096283.56178: variable 'ansible_search_path' from source: unknown 18911 1727096283.56200: calling self._execute() 18911 1727096283.56284: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096283.56296: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096283.56345: variable 'omit' from source: magic vars 18911 1727096283.56692: variable 'ansible_distribution' from source: facts 18911 1727096283.56711: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 18911 1727096283.56859: variable 'ansible_distribution_major_version' from source: facts 18911 1727096283.56973: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 18911 1727096283.56976: when evaluation is False, skipping this task 18911 1727096283.56979: _execute() done 18911 1727096283.56981: dumping result to json 18911 1727096283.56983: done dumping result, returning 18911 1727096283.56986: done running TaskExecutor() for managed_node1/TASK: Create EPEL 10 [0afff68d-5257-09a7-aae1-0000000000ad] 18911 1727096283.56989: sending task result for task 0afff68d-5257-09a7-aae1-0000000000ad 18911 1727096283.57215: done sending task result for task 0afff68d-5257-09a7-aae1-0000000000ad skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 18911 1727096283.57277: no more pending results, returning what we have 18911 1727096283.57281: results queue empty 18911 1727096283.57281: checking for any_errors_fatal 18911 1727096283.57283: done checking for any_errors_fatal 18911 1727096283.57284: checking for max_fail_percentage 18911 1727096283.57286: done 
checking for max_fail_percentage 18911 1727096283.57286: checking to see if all hosts have failed and the running result is not ok 18911 1727096283.57287: done checking to see if all hosts have failed 18911 1727096283.57288: getting the remaining hosts for this loop 18911 1727096283.57289: done getting the remaining hosts for this loop 18911 1727096283.57293: getting the next task for host managed_node1 18911 1727096283.57301: done getting next task for host managed_node1 18911 1727096283.57304: ^ task is: TASK: Install yum-utils package 18911 1727096283.57308: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096283.57311: getting variables 18911 1727096283.57313: in VariableManager get_vars() 18911 1727096283.57349: Calling all_inventory to load vars for managed_node1 18911 1727096283.57352: Calling groups_inventory to load vars for managed_node1 18911 1727096283.57356: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096283.57375: Calling all_plugins_play to load vars for managed_node1 18911 1727096283.57379: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096283.57383: Calling groups_plugins_play to load vars for managed_node1 18911 1727096283.57846: WORKER PROCESS EXITING 18911 1727096283.57865: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096283.58127: done with get_vars() 18911 1727096283.58141: done getting variables 18911 1727096283.58251: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Monday 23 September 2024 08:58:03 -0400 (0:00:00.028) 0:00:02.697 ****** 18911 1727096283.58283: entering _queue_task() for managed_node1/package 18911 1727096283.58285: Creating lock for package 18911 1727096283.58610: worker is 1 (out of 1 available) 18911 1727096283.58621: exiting _queue_task() for managed_node1/package 18911 1727096283.58633: done queuing things up, now waiting for results queue to drain 18911 1727096283.58634: waiting for pending results... 
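The next task is guarded by the same version check. Given its name, it presumably installs yum-utils through the generic `package` module — a hedged sketch of enable_epel.yml:26, with the package name inferred from the task name rather than read from the source:

```yaml
# Sketch of enable_epel.yml:26; the package name is inferred from the
# task name, not taken from the source file.
- name: Install yum-utils package
  package:
    name: yum-utils
    state: present
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']
```

As with "Create EPEL 10" above, the major-version guard fails on this host and the task is skipped; the subsequent "Enable EPEL 7" and "Enable EPEL 8" tasks follow the same pattern.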
18911 1727096283.58837: running TaskExecutor() for managed_node1/TASK: Install yum-utils package 18911 1727096283.58960: in run() - task 0afff68d-5257-09a7-aae1-0000000000ae 18911 1727096283.58981: variable 'ansible_search_path' from source: unknown 18911 1727096283.58988: variable 'ansible_search_path' from source: unknown 18911 1727096283.59030: calling self._execute() 18911 1727096283.59169: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096283.59177: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096283.59180: variable 'omit' from source: magic vars 18911 1727096283.59550: variable 'ansible_distribution' from source: facts 18911 1727096283.59573: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 18911 1727096283.59715: variable 'ansible_distribution_major_version' from source: facts 18911 1727096283.59729: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 18911 1727096283.59736: when evaluation is False, skipping this task 18911 1727096283.59743: _execute() done 18911 1727096283.59749: dumping result to json 18911 1727096283.59756: done dumping result, returning 18911 1727096283.59776: done running TaskExecutor() for managed_node1/TASK: Install yum-utils package [0afff68d-5257-09a7-aae1-0000000000ae] 18911 1727096283.59831: sending task result for task 0afff68d-5257-09a7-aae1-0000000000ae 18911 1727096283.59910: done sending task result for task 0afff68d-5257-09a7-aae1-0000000000ae 18911 1727096283.59913: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 18911 1727096283.59984: no more pending results, returning what we have 18911 1727096283.59988: results queue empty 18911 1727096283.59988: checking for any_errors_fatal 18911 1727096283.59995: done checking for any_errors_fatal 18911 
1727096283.59996: checking for max_fail_percentage 18911 1727096283.59997: done checking for max_fail_percentage 18911 1727096283.59998: checking to see if all hosts have failed and the running result is not ok 18911 1727096283.59999: done checking to see if all hosts have failed 18911 1727096283.59999: getting the remaining hosts for this loop 18911 1727096283.60001: done getting the remaining hosts for this loop 18911 1727096283.60004: getting the next task for host managed_node1 18911 1727096283.60012: done getting next task for host managed_node1 18911 1727096283.60014: ^ task is: TASK: Enable EPEL 7 18911 1727096283.60018: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096283.60021: getting variables 18911 1727096283.60022: in VariableManager get_vars() 18911 1727096283.60052: Calling all_inventory to load vars for managed_node1 18911 1727096283.60054: Calling groups_inventory to load vars for managed_node1 18911 1727096283.60058: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096283.60075: Calling all_plugins_play to load vars for managed_node1 18911 1727096283.60078: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096283.60081: Calling groups_plugins_play to load vars for managed_node1 18911 1727096283.60373: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096283.60723: done with get_vars() 18911 1727096283.60733: done getting variables 18911 1727096283.60793: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Monday 23 September 2024 08:58:03 -0400 (0:00:00.025) 0:00:02.722 ****** 18911 1727096283.60826: entering _queue_task() for managed_node1/command 18911 1727096283.61180: worker is 1 (out of 1 available) 18911 1727096283.61190: exiting _queue_task() for managed_node1/command 18911 1727096283.61200: done queuing things up, now waiting for results queue to drain 18911 1727096283.61202: waiting for pending results... 
18911 1727096283.61436: running TaskExecutor() for managed_node1/TASK: Enable EPEL 7 18911 1727096283.61463: in run() - task 0afff68d-5257-09a7-aae1-0000000000af 18911 1727096283.61490: variable 'ansible_search_path' from source: unknown 18911 1727096283.61499: variable 'ansible_search_path' from source: unknown 18911 1727096283.61541: calling self._execute() 18911 1727096283.61626: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096283.61701: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096283.61704: variable 'omit' from source: magic vars 18911 1727096283.62102: variable 'ansible_distribution' from source: facts 18911 1727096283.62121: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 18911 1727096283.62300: variable 'ansible_distribution_major_version' from source: facts 18911 1727096283.62313: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 18911 1727096283.62321: when evaluation is False, skipping this task 18911 1727096283.62328: _execute() done 18911 1727096283.62335: dumping result to json 18911 1727096283.62343: done dumping result, returning 18911 1727096283.62358: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 7 [0afff68d-5257-09a7-aae1-0000000000af] 18911 1727096283.62400: sending task result for task 0afff68d-5257-09a7-aae1-0000000000af 18911 1727096283.62613: done sending task result for task 0afff68d-5257-09a7-aae1-0000000000af 18911 1727096283.62616: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 18911 1727096283.62689: no more pending results, returning what we have 18911 1727096283.62693: results queue empty 18911 1727096283.62694: checking for any_errors_fatal 18911 1727096283.62699: done checking for any_errors_fatal 18911 1727096283.62700: checking for 
max_fail_percentage 18911 1727096283.62702: done checking for max_fail_percentage 18911 1727096283.62703: checking to see if all hosts have failed and the running result is not ok 18911 1727096283.62703: done checking to see if all hosts have failed 18911 1727096283.62704: getting the remaining hosts for this loop 18911 1727096283.62706: done getting the remaining hosts for this loop 18911 1727096283.62709: getting the next task for host managed_node1 18911 1727096283.62718: done getting next task for host managed_node1 18911 1727096283.62720: ^ task is: TASK: Enable EPEL 8 18911 1727096283.62725: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096283.62729: getting variables 18911 1727096283.62731: in VariableManager get_vars() 18911 1727096283.62764: Calling all_inventory to load vars for managed_node1 18911 1727096283.62769: Calling groups_inventory to load vars for managed_node1 18911 1727096283.62773: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096283.62900: Calling all_plugins_play to load vars for managed_node1 18911 1727096283.62905: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096283.62911: Calling groups_plugins_play to load vars for managed_node1 18911 1727096283.63184: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096283.63384: done with get_vars() 18911 1727096283.63393: done getting variables 18911 1727096283.63457: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Enable EPEL 8] ***********************************************************
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37
Monday 23 September 2024 08:58:03 -0400 (0:00:00.026) 0:00:02.749 ******
18911 1727096283.63490: entering _queue_task() for managed_node1/command 18911 1727096283.63741: worker is 1 (out of 1 available) 18911 1727096283.63754: exiting _queue_task() for managed_node1/command 18911 1727096283.63883: done queuing things up, now waiting for results queue to drain 18911 1727096283.63885: waiting for pending results... 
18911 1727096283.64111: running TaskExecutor() for managed_node1/TASK: Enable EPEL 8 18911 1727096283.64173: in run() - task 0afff68d-5257-09a7-aae1-0000000000b0 18911 1727096283.64177: variable 'ansible_search_path' from source: unknown 18911 1727096283.64180: variable 'ansible_search_path' from source: unknown 18911 1727096283.64228: calling self._execute() 18911 1727096283.64316: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096283.64336: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096283.64338: variable 'omit' from source: magic vars 18911 1727096283.64697: variable 'ansible_distribution' from source: facts 18911 1727096283.64750: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 18911 1727096283.64842: variable 'ansible_distribution_major_version' from source: facts 18911 1727096283.64859: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 18911 1727096283.64876: when evaluation is False, skipping this task 18911 1727096283.64882: _execute() done 18911 1727096283.64889: dumping result to json 18911 1727096283.64894: done dumping result, returning 18911 1727096283.64971: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 8 [0afff68d-5257-09a7-aae1-0000000000b0] 18911 1727096283.64974: sending task result for task 0afff68d-5257-09a7-aae1-0000000000b0 18911 1727096283.65037: done sending task result for task 0afff68d-5257-09a7-aae1-0000000000b0 18911 1727096283.65040: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version in ['7', '8']",
    "skip_reason": "Conditional result was False"
}
18911 1727096283.65096: no more pending results, returning what we have 18911 1727096283.65100: results queue empty 18911 1727096283.65100: checking for any_errors_fatal 18911 1727096283.65105: done checking for any_errors_fatal 18911 1727096283.65105: checking for 
max_fail_percentage 18911 1727096283.65107: done checking for max_fail_percentage 18911 1727096283.65108: checking to see if all hosts have failed and the running result is not ok 18911 1727096283.65108: done checking to see if all hosts have failed 18911 1727096283.65109: getting the remaining hosts for this loop 18911 1727096283.65111: done getting the remaining hosts for this loop 18911 1727096283.65115: getting the next task for host managed_node1 18911 1727096283.65125: done getting next task for host managed_node1 18911 1727096283.65127: ^ task is: TASK: Enable EPEL 6 18911 1727096283.65131: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096283.65134: getting variables 18911 1727096283.65136: in VariableManager get_vars() 18911 1727096283.65171: Calling all_inventory to load vars for managed_node1 18911 1727096283.65174: Calling groups_inventory to load vars for managed_node1 18911 1727096283.65179: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096283.65192: Calling all_plugins_play to load vars for managed_node1 18911 1727096283.65196: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096283.65199: Calling groups_plugins_play to load vars for managed_node1 18911 1727096283.65604: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096283.65835: done with get_vars() 18911 1727096283.65845: done getting variables 18911 1727096283.65925: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Enable EPEL 6] ***********************************************************
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42
Monday 23 September 2024 08:58:03 -0400 (0:00:00.024) 0:00:02.773 ******
18911 1727096283.65954: entering _queue_task() for managed_node1/copy 18911 1727096283.66212: worker is 1 (out of 1 available) 18911 1727096283.66224: exiting _queue_task() for managed_node1/copy 18911 1727096283.66319: done queuing things up, now waiting for results queue to drain 18911 1727096283.66321: waiting for pending results... 
18911 1727096283.66623: running TaskExecutor() for managed_node1/TASK: Enable EPEL 6 18911 1727096283.66705: in run() - task 0afff68d-5257-09a7-aae1-0000000000b2 18911 1727096283.66731: variable 'ansible_search_path' from source: unknown 18911 1727096283.66739: variable 'ansible_search_path' from source: unknown 18911 1727096283.66782: calling self._execute() 18911 1727096283.66859: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096283.66878: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096283.66892: variable 'omit' from source: magic vars 18911 1727096283.67392: variable 'ansible_distribution' from source: facts 18911 1727096283.67410: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 18911 1727096283.67535: variable 'ansible_distribution_major_version' from source: facts 18911 1727096283.67563: Evaluated conditional (ansible_distribution_major_version == '6'): False 18911 1727096283.67579: when evaluation is False, skipping this task 18911 1727096283.67608: _execute() done 18911 1727096283.67611: dumping result to json 18911 1727096283.67613: done dumping result, returning 18911 1727096283.67616: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 6 [0afff68d-5257-09a7-aae1-0000000000b2] 18911 1727096283.67684: sending task result for task 0afff68d-5257-09a7-aae1-0000000000b2 18911 1727096283.67754: done sending task result for task 0afff68d-5257-09a7-aae1-0000000000b2
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '6'",
    "skip_reason": "Conditional result was False"
}
18911 1727096283.67810: no more pending results, returning what we have 18911 1727096283.67814: results queue empty 18911 1727096283.67815: checking for any_errors_fatal 18911 1727096283.67820: done checking for any_errors_fatal 18911 1727096283.67821: checking for max_fail_percentage 18911 1727096283.67823: done checking for 
max_fail_percentage 18911 1727096283.67824: checking to see if all hosts have failed and the running result is not ok 18911 1727096283.67824: done checking to see if all hosts have failed 18911 1727096283.67825: getting the remaining hosts for this loop 18911 1727096283.67826: done getting the remaining hosts for this loop 18911 1727096283.67830: getting the next task for host managed_node1 18911 1727096283.67840: done getting next task for host managed_node1 18911 1727096283.67843: ^ task is: TASK: Set network provider to 'nm' 18911 1727096283.67845: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18911 1727096283.67848: getting variables 18911 1727096283.67850: in VariableManager get_vars() 18911 1727096283.67882: Calling all_inventory to load vars for managed_node1 18911 1727096283.67885: Calling groups_inventory to load vars for managed_node1 18911 1727096283.67888: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096283.67899: Calling all_plugins_play to load vars for managed_node1 18911 1727096283.67902: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096283.67904: Calling groups_plugins_play to load vars for managed_node1 18911 1727096283.68377: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096283.68564: done with get_vars() 18911 1727096283.68575: done getting variables 18911 1727096283.68609: WORKER PROCESS EXITING 18911 1727096283.68642: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Set network provider to 'nm'] ********************************************
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml:13
Monday 23 September 2024 08:58:03 -0400 (0:00:00.027) 0:00:02.801 ******
18911 1727096283.68670: entering _queue_task() for managed_node1/set_fact 18911 1727096283.68945: worker is 1 (out of 1 available) 18911 1727096283.68956: exiting _queue_task() for managed_node1/set_fact 18911 1727096283.68970: done queuing things up, now waiting for results queue to drain 18911 1727096283.68972: waiting for pending results... 18911 1727096283.69184: running TaskExecutor() for managed_node1/TASK: Set network provider to 'nm' 18911 1727096283.69279: in run() - task 0afff68d-5257-09a7-aae1-000000000007 18911 1727096283.69298: variable 'ansible_search_path' from source: unknown 18911 1727096283.69339: calling self._execute() 18911 1727096283.69420: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096283.69431: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096283.69446: variable 'omit' from source: magic vars 18911 1727096283.69554: variable 'omit' from source: magic vars 18911 1727096283.69594: variable 'omit' from source: magic vars 18911 1727096283.69639: variable 'omit' from source: magic vars 18911 1727096283.69738: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18911 1727096283.69741: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18911 1727096283.69755: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18911 1727096283.69784: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096283.69800: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096283.69832: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18911 1727096283.69846: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096283.69854: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096283.69964: Set connection var ansible_shell_executable to /bin/sh 18911 1727096283.69976: Set connection var ansible_timeout to 10 18911 1727096283.69998: Set connection var ansible_shell_type to sh 18911 1727096283.70001: Set connection var ansible_module_compression to ZIP_DEFLATED 18911 1727096283.70005: Set connection var ansible_pipelining to False 18911 1727096283.70014: Set connection var ansible_connection to ssh 18911 1727096283.70062: variable 'ansible_shell_executable' from source: unknown 18911 1727096283.70065: variable 'ansible_connection' from source: unknown 18911 1727096283.70068: variable 'ansible_module_compression' from source: unknown 18911 1727096283.70071: variable 'ansible_shell_type' from source: unknown 18911 1727096283.70073: variable 'ansible_shell_executable' from source: unknown 18911 1727096283.70075: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096283.70077: variable 'ansible_pipelining' from source: unknown 18911 1727096283.70079: variable 'ansible_timeout' from source: unknown 18911 1727096283.70081: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096283.70281: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18911 1727096283.70285: variable 'omit' from source: magic vars 18911 1727096283.70287: starting attempt loop 18911 1727096283.70289: running the handler 18911 1727096283.70291: handler run complete 18911 1727096283.70293: attempt loop complete, returning result 18911 1727096283.70295: _execute() done 18911 1727096283.70297: dumping result to json 18911 1727096283.70299: done dumping result, returning 18911 1727096283.70306: done running TaskExecutor() for managed_node1/TASK: Set network provider to 'nm' [0afff68d-5257-09a7-aae1-000000000007] 18911 1727096283.70313: sending task result for task 0afff68d-5257-09a7-aae1-000000000007 ok: [managed_node1] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 18911 1727096283.70689: no more pending results, returning what we have 18911 1727096283.70692: results queue empty 18911 1727096283.70692: checking for any_errors_fatal 18911 1727096283.70696: done checking for any_errors_fatal 18911 1727096283.70697: checking for max_fail_percentage 18911 1727096283.70698: done checking for max_fail_percentage 18911 1727096283.70699: checking to see if all hosts have failed and the running result is not ok 18911 1727096283.70700: done checking to see if all hosts have failed 18911 1727096283.70701: getting the remaining hosts for this loop 18911 1727096283.70702: done getting the remaining hosts for this loop 18911 1727096283.70704: getting the next task for host managed_node1 18911 1727096283.70710: done getting next task for host managed_node1 18911 1727096283.70712: ^ task is: TASK: meta (flush_handlers) 18911 1727096283.70713: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18911 1727096283.70717: getting variables 18911 1727096283.70718: in VariableManager get_vars() 18911 1727096283.70742: Calling all_inventory to load vars for managed_node1 18911 1727096283.70744: Calling groups_inventory to load vars for managed_node1 18911 1727096283.70747: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096283.70757: Calling all_plugins_play to load vars for managed_node1 18911 1727096283.70760: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096283.70763: Calling groups_plugins_play to load vars for managed_node1 18911 1727096283.70917: done sending task result for task 0afff68d-5257-09a7-aae1-000000000007 18911 1727096283.70921: WORKER PROCESS EXITING 18911 1727096283.70941: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096283.71330: done with get_vars() 18911 1727096283.71339: done getting variables 18911 1727096283.71403: in VariableManager get_vars() 18911 1727096283.71416: Calling all_inventory to load vars for managed_node1 18911 1727096283.71419: Calling groups_inventory to load vars for managed_node1 18911 1727096283.71421: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096283.71425: Calling all_plugins_play to load vars for managed_node1 18911 1727096283.71428: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096283.71430: Calling groups_plugins_play to load vars for managed_node1 18911 1727096283.71558: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096283.71718: done with get_vars() 18911 1727096283.71732: done queuing things up, now waiting for results queue to drain 18911 1727096283.71734: results queue empty 18911 1727096283.71735: checking for any_errors_fatal 18911 1727096283.71736: done 
checking for any_errors_fatal 18911 1727096283.71737: checking for max_fail_percentage 18911 1727096283.71742: done checking for max_fail_percentage 18911 1727096283.71743: checking to see if all hosts have failed and the running result is not ok 18911 1727096283.71743: done checking to see if all hosts have failed 18911 1727096283.71744: getting the remaining hosts for this loop 18911 1727096283.71745: done getting the remaining hosts for this loop 18911 1727096283.71747: getting the next task for host managed_node1 18911 1727096283.71750: done getting next task for host managed_node1 18911 1727096283.71752: ^ task is: TASK: meta (flush_handlers) 18911 1727096283.71753: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18911 1727096283.71760: getting variables 18911 1727096283.71761: in VariableManager get_vars() 18911 1727096283.71770: Calling all_inventory to load vars for managed_node1 18911 1727096283.71772: Calling groups_inventory to load vars for managed_node1 18911 1727096283.71774: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096283.71778: Calling all_plugins_play to load vars for managed_node1 18911 1727096283.71779: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096283.71782: Calling groups_plugins_play to load vars for managed_node1 18911 1727096283.71900: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096283.72109: done with get_vars() 18911 1727096283.72117: done getting variables 18911 1727096283.72161: in VariableManager get_vars() 18911 1727096283.72170: Calling all_inventory to load vars for managed_node1 18911 1727096283.72173: Calling groups_inventory to load vars for managed_node1 
18911 1727096283.72180: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096283.72186: Calling all_plugins_play to load vars for managed_node1 18911 1727096283.72188: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096283.72191: Calling groups_plugins_play to load vars for managed_node1 18911 1727096283.72321: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096283.72510: done with get_vars() 18911 1727096283.72521: done queuing things up, now waiting for results queue to drain 18911 1727096283.72523: results queue empty 18911 1727096283.72523: checking for any_errors_fatal 18911 1727096283.72525: done checking for any_errors_fatal 18911 1727096283.72525: checking for max_fail_percentage 18911 1727096283.72526: done checking for max_fail_percentage 18911 1727096283.72527: checking to see if all hosts have failed and the running result is not ok 18911 1727096283.72528: done checking to see if all hosts have failed 18911 1727096283.72528: getting the remaining hosts for this loop 18911 1727096283.72529: done getting the remaining hosts for this loop 18911 1727096283.72531: getting the next task for host managed_node1 18911 1727096283.72534: done getting next task for host managed_node1 18911 1727096283.72535: ^ task is: None 18911 1727096283.72536: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096283.72538: done queuing things up, now waiting for results queue to drain 18911 1727096283.72538: results queue empty 18911 1727096283.72539: checking for any_errors_fatal 18911 1727096283.72540: done checking for any_errors_fatal 18911 1727096283.72540: checking for max_fail_percentage 18911 1727096283.72541: done checking for max_fail_percentage 18911 1727096283.72542: checking to see if all hosts have failed and the running result is not ok 18911 1727096283.72543: done checking to see if all hosts have failed 18911 1727096283.72544: getting the next task for host managed_node1 18911 1727096283.72547: done getting next task for host managed_node1 18911 1727096283.72548: ^ task is: None 18911 1727096283.72549: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096283.72595: in VariableManager get_vars() 18911 1727096283.72614: done with get_vars() 18911 1727096283.72621: in VariableManager get_vars() 18911 1727096283.72630: done with get_vars() 18911 1727096283.72634: variable 'omit' from source: magic vars 18911 1727096283.72666: in VariableManager get_vars() 18911 1727096283.72678: done with get_vars() 18911 1727096283.72699: variable 'omit' from source: magic vars

PLAY [Play for showing the network provider] ***********************************
18911 1727096283.72886: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18911 1727096283.72913: getting the remaining hosts for this loop 18911 1727096283.72914: done getting the remaining hosts for this loop 18911 1727096283.72917: getting the next task for host managed_node1 18911 1727096283.72919: done getting next task for host managed_node1 18911 1727096283.72921: ^ task is: TASK: Gathering Facts 18911 1727096283.72922: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096283.72924: getting variables 18911 1727096283.72925: in VariableManager get_vars() 18911 1727096283.72932: Calling all_inventory to load vars for managed_node1 18911 1727096283.72934: Calling groups_inventory to load vars for managed_node1 18911 1727096283.72936: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096283.72947: Calling all_plugins_play to load vars for managed_node1 18911 1727096283.72960: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096283.72963: Calling groups_plugins_play to load vars for managed_node1 18911 1727096283.73138: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096283.73318: done with get_vars() 18911 1727096283.73326: done getting variables 18911 1727096283.73366: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:3
Monday 23 September 2024 08:58:03 -0400 (0:00:00.047) 0:00:02.848 ******
18911 1727096283.73396: entering _queue_task() for managed_node1/gather_facts 18911 1727096283.73733: worker is 1 (out of 1 available) 18911 1727096283.73745: exiting _queue_task() for managed_node1/gather_facts 18911 1727096283.73755: done queuing things up, now waiting for results queue to drain 18911 1727096283.73756: waiting for pending results... 
18911 1727096283.73988: running TaskExecutor() for managed_node1/TASK: Gathering Facts 18911 1727096283.74148: in run() - task 0afff68d-5257-09a7-aae1-0000000000d8 18911 1727096283.74152: variable 'ansible_search_path' from source: unknown 18911 1727096283.74171: calling self._execute() 18911 1727096283.74261: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096283.74277: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096283.74366: variable 'omit' from source: magic vars 18911 1727096283.74674: variable 'ansible_distribution_major_version' from source: facts 18911 1727096283.74700: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096283.74711: variable 'omit' from source: magic vars 18911 1727096283.74742: variable 'omit' from source: magic vars 18911 1727096283.74785: variable 'omit' from source: magic vars 18911 1727096283.74831: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18911 1727096283.74871: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18911 1727096283.74895: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18911 1727096283.74922: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096283.74935: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096283.74966: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18911 1727096283.74976: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096283.75017: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096283.75085: Set connection var ansible_shell_executable to /bin/sh 18911 1727096283.75097: Set 
connection var ansible_timeout to 10 18911 1727096283.75103: Set connection var ansible_shell_type to sh 18911 1727096283.75113: Set connection var ansible_module_compression to ZIP_DEFLATED 18911 1727096283.75126: Set connection var ansible_pipelining to False 18911 1727096283.75135: Set connection var ansible_connection to ssh 18911 1727096283.75156: variable 'ansible_shell_executable' from source: unknown 18911 1727096283.75171: variable 'ansible_connection' from source: unknown 18911 1727096283.75173: variable 'ansible_module_compression' from source: unknown 18911 1727096283.75176: variable 'ansible_shell_type' from source: unknown 18911 1727096283.75234: variable 'ansible_shell_executable' from source: unknown 18911 1727096283.75237: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096283.75239: variable 'ansible_pipelining' from source: unknown 18911 1727096283.75241: variable 'ansible_timeout' from source: unknown 18911 1727096283.75243: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096283.75377: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18911 1727096283.75392: variable 'omit' from source: magic vars 18911 1727096283.75400: starting attempt loop 18911 1727096283.75406: running the handler 18911 1727096283.75422: variable 'ansible_facts' from source: unknown 18911 1727096283.75441: _low_level_execute_command(): starting 18911 1727096283.75457: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18911 1727096283.76184: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096283.76223: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 18911 1727096283.76331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096283.76346: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096283.76365: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096283.76497: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18911 1727096283.78896: stdout chunk (state=3): >>>/root <<< 18911 1727096283.79100: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096283.79118: stdout chunk (state=3): >>><<< 18911 1727096283.79130: stderr chunk (state=3): >>><<< 18911 1727096283.79175: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 18911 1727096283.79178: _low_level_execute_command(): starting 18911 1727096283.79187: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096283.7915494-19047-249883186313252 `" && echo ansible-tmp-1727096283.7915494-19047-249883186313252="` echo /root/.ansible/tmp/ansible-tmp-1727096283.7915494-19047-249883186313252 `" ) && sleep 0' 18911 1727096283.79827: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096283.79842: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096283.79856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096283.79882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096283.79939: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096283.80005: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096283.80022: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096283.80056: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096283.80199: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18911 1727096283.83074: stdout chunk (state=3): >>>ansible-tmp-1727096283.7915494-19047-249883186313252=/root/.ansible/tmp/ansible-tmp-1727096283.7915494-19047-249883186313252 <<< 18911 1727096283.83365: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096283.83673: stdout chunk (state=3): >>><<< 18911 1727096283.83676: stderr chunk (state=3): >>><<< 18911 1727096283.83679: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096283.7915494-19047-249883186313252=/root/.ansible/tmp/ansible-tmp-1727096283.7915494-19047-249883186313252 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 18911 1727096283.83682: variable 'ansible_module_compression' from source: unknown 18911 1727096283.83685: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18911d7od04qi/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 18911 1727096283.83686: variable 'ansible_facts' from source: unknown 18911 1727096283.84483: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096283.7915494-19047-249883186313252/AnsiballZ_setup.py 18911 1727096283.85203: Sending initial data 18911 1727096283.85206: Sent initial data (154 bytes) 18911 1727096283.86391: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096283.86597: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096283.86600: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096283.86610: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096283.86879: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18911 1727096283.89207: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18911 1727096283.89276: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18911 1727096283.89357: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18911d7od04qi/tmpcsmxprij /root/.ansible/tmp/ansible-tmp-1727096283.7915494-19047-249883186313252/AnsiballZ_setup.py <<< 18911 1727096283.89360: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096283.7915494-19047-249883186313252/AnsiballZ_setup.py" <<< 18911 1727096283.89410: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18911d7od04qi/tmpcsmxprij" to remote "/root/.ansible/tmp/ansible-tmp-1727096283.7915494-19047-249883186313252/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096283.7915494-19047-249883186313252/AnsiballZ_setup.py" <<< 18911 1727096283.92397: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096283.92485: stderr chunk (state=3): >>><<< 18911 1727096283.92489: stdout chunk (state=3): >>><<< 18911 1727096283.92500: done transferring module to remote 18911 1727096283.92514: _low_level_execute_command(): starting 18911 1727096283.92667: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096283.7915494-19047-249883186313252/ /root/.ansible/tmp/ansible-tmp-1727096283.7915494-19047-249883186313252/AnsiballZ_setup.py && sleep 0' 18911 1727096283.93611: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096283.93695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096283.93742: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096283.93752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096283.93815: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096283.93824: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096283.93978: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096283.94044: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18911 1727096283.97141: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096283.97149: stdout chunk (state=3): >>><<< 18911 1727096283.97151: stderr chunk (state=3): >>><<< 18911 1727096283.97466: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 18911 1727096283.97474: _low_level_execute_command(): starting 18911 1727096283.97477: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096283.7915494-19047-249883186313252/AnsiballZ_setup.py && sleep 0' 18911 1727096283.98400: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096283.98678: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096283.98691: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 
1727096283.98807: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18911 1727096284.84920: stdout chunk (state=3): >>> <<< 18911 1727096284.84997: stdout chunk (state=3): >>>{"ansible_facts": {"ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCSodd+GDkKMH+cjKateYxMnFqGIiE1cnsha6Cte8GTIWdeerR6yaWx/72lnuD/E1u9q+QiEGNjwKJ+EVlEbMpe1E41fS12cQMWtucXlJvMJH1Ykit9m7FF02YjLW9EkA9SSgc18cxhBxGHI+sOyXoiqbbp/q8CeDDI6WWQ+a3ZlRONcuaxwv9JBCJY6ZIqEDUsG55fAkwTft0LrM9VfN5D7iCcrlRTAVcjACJKgdvzHt52sVh5wuhCgzisIGrxldgqYAo0PnKgLwHzMWTYpwBJQIwMf6hC7O+PTnljDRXhsY8vTDdVI9W4Nr7mLWJ7W5JZ0lrKajT+FZ1b8usigp6firlnTtqzVmKJ4HJTlBOQ00HmCvVRSFb8Y15UB8Gz/xjJSXiQklJoKt+/wRRazuZALEowaIIdTzQUGv5Yx/BYhcAv8KVvhqbfkvyuwAgI+jzck3nhlUJ3B1WVYs5RdDY1WWEbqEtKyERi7P+cyWg6OhEMmJIpLs1QBW+CU/89w+s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE9Vvl7mU+dsvtGHlOyMt0Nx2BOUkSxor6Ul/pQgHJkP0t5ie5s9TW9uniTzz5NfayEXaMlnOkK/UBl0o7msG9k=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOsOWQBYI+BXY81/x/jv3URNQxi5p8vLDeQfEZVgnjS4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", 
"ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fips": false, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-125", "ansible_nodename": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2082d1236a0674562ec5e13633e7ec", "ansible_fibre_channel_wwn": [], "ansible_iscsi_iqn": "", "ansible_lsb": {}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", 
"LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 40710 10.31.11.125 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 40710 22", "DEBUGINFOD<<< 18911 1727096284.85121: stdout chunk (state=3): >>>_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_loadavg": {"1m": 0.4013671875, "5m": 0.33447265625, "15m": 0.16796875}, "ansible_is_chroot": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "58", "second": "04", "epoch": "1727096284", "epoch_int": "1727096284", "date": "2024-09-23", "time": "08:58:04", "iso8601_micro": "2024-09-23T12:58:04.421546Z", "iso8601": "2024-09-23T12:58:04Z", "iso8601_basic": "20240923T085804421546", "iso8601_basic_short": "20240923T085804", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_pkg_mgr": "dnf", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2948, 
"ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 583, "free": 2948}, "nocache": {"free": 3285, "used": 246}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_uuid": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 437, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", 
"options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261795229696, "block_size": 4096, "block_total": 65519099, "block_available": 63914851, "block_used": 1604248, "inode_total": 131070960, "inode_available": 131029096, "inode_used": 41864, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc"<<< 18911 1727096284.85156: stdout chunk (state=3): >>>: false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", 
"tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10ff:acff:fe3f:90f5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": 
"on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 
9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.125"], "ansible_all_ipv6_addresses": ["fe80::10ff:acff:fe3f:90f5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.125", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10ff:acff:fe3f:90f5"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 18911 1727096284.87992: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. <<< 18911 1727096284.88084: stderr chunk (state=3): >>><<< 18911 1727096284.88094: stdout chunk (state=3): >>><<< 18911 1727096284.88151: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCSodd+GDkKMH+cjKateYxMnFqGIiE1cnsha6Cte8GTIWdeerR6yaWx/72lnuD/E1u9q+QiEGNjwKJ+EVlEbMpe1E41fS12cQMWtucXlJvMJH1Ykit9m7FF02YjLW9EkA9SSgc18cxhBxGHI+sOyXoiqbbp/q8CeDDI6WWQ+a3ZlRONcuaxwv9JBCJY6ZIqEDUsG55fAkwTft0LrM9VfN5D7iCcrlRTAVcjACJKgdvzHt52sVh5wuhCgzisIGrxldgqYAo0PnKgLwHzMWTYpwBJQIwMf6hC7O+PTnljDRXhsY8vTDdVI9W4Nr7mLWJ7W5JZ0lrKajT+FZ1b8usigp6firlnTtqzVmKJ4HJTlBOQ00HmCvVRSFb8Y15UB8Gz/xjJSXiQklJoKt+/wRRazuZALEowaIIdTzQUGv5Yx/BYhcAv8KVvhqbfkvyuwAgI+jzck3nhlUJ3B1WVYs5RdDY1WWEbqEtKyERi7P+cyWg6OhEMmJIpLs1QBW+CU/89w+s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", 
"ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE9Vvl7mU+dsvtGHlOyMt0Nx2BOUkSxor6Ul/pQgHJkP0t5ie5s9TW9uniTzz5NfayEXaMlnOkK/UBl0o7msG9k=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOsOWQBYI+BXY81/x/jv3URNQxi5p8vLDeQfEZVgnjS4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fips": false, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": 
"ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-125", "ansible_nodename": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2082d1236a0674562ec5e13633e7ec", "ansible_fibre_channel_wwn": [], "ansible_iscsi_iqn": "", "ansible_lsb": {}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 40710 10.31.11.125 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 40710 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_loadavg": {"1m": 0.4013671875, "5m": 0.33447265625, "15m": 0.16796875}, "ansible_is_chroot": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "58", "second": "04", "epoch": "1727096284", "epoch_int": "1727096284", "date": "2024-09-23", "time": "08:58:04", "iso8601_micro": "2024-09-23T12:58:04.421546Z", "iso8601": "2024-09-23T12:58:04Z", "iso8601_basic": "20240923T085804421546", "iso8601_basic_short": "20240923T085804", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_pkg_mgr": "dnf", "ansible_python": {"version": {"major": 3, 
"minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2948, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 583, "free": 2948}, "nocache": {"free": 3285, "used": 246}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_uuid": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], 
"uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 437, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261795229696, "block_size": 4096, "block_total": 65519099, "block_available": 63914851, "block_used": 1604248, "inode_total": 131070960, "inode_available": 131029096, "inode_used": 41864, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": 
"off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10ff:acff:fe3f:90f5", "prefix": "64", "scope": "link"}], "features": 
{"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", 
"tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.125"], "ansible_all_ipv6_addresses": ["fe80::10ff:acff:fe3f:90f5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.125", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10ff:acff:fe3f:90f5"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 18911 1727096284.88835: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096283.7915494-19047-249883186313252/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18911 1727096284.88863: _low_level_execute_command(): starting 18911 1727096284.88875: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096283.7915494-19047-249883186313252/ > /dev/null 2>&1 && sleep 0' 18911 1727096284.89543: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096284.89572: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096284.89682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096284.89698: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096284.89718: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096284.89827: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18911 1727096284.92574: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096284.92632: stdout chunk (state=3): >>><<< 18911 1727096284.92650: stderr chunk (state=3): >>><<< 18911 1727096284.92686: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 18911 1727096284.92700: handler run complete 18911 1727096284.92886: variable 'ansible_facts' from source: unknown 18911 1727096284.93007: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096284.93363: variable 'ansible_facts' from source: unknown 18911 1727096284.93496: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096284.93625: attempt loop complete, returning result 18911 1727096284.93635: _execute() done 18911 1727096284.93687: dumping result to json 18911 1727096284.93728: done dumping result, returning 18911 1727096284.93741: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0afff68d-5257-09a7-aae1-0000000000d8] 18911 1727096284.93822: sending task result for task 0afff68d-5257-09a7-aae1-0000000000d8 18911 1727096284.94327: done sending task result for task 0afff68d-5257-09a7-aae1-0000000000d8 18911 1727096284.94329: WORKER PROCESS EXITING ok: [managed_node1] 18911 1727096284.94948: no more pending results, returning what we have 18911 1727096284.94951: results queue empty 18911 1727096284.94952: checking for any_errors_fatal 18911 1727096284.94953: done checking for any_errors_fatal 18911 1727096284.94953: checking for max_fail_percentage 18911 1727096284.94955: done checking for max_fail_percentage 18911 1727096284.94956: checking to see if all hosts have failed and the running result is not ok 18911 1727096284.94957: done checking to see if all hosts have failed 18911 1727096284.94957: getting the remaining hosts for this loop 18911 1727096284.94958: done getting the remaining hosts for this loop 18911 1727096284.94962: getting the next task for host managed_node1 18911 
1727096284.94969: done getting next task for host managed_node1 18911 1727096284.94971: ^ task is: TASK: meta (flush_handlers) 18911 1727096284.94972: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18911 1727096284.94976: getting variables 18911 1727096284.94977: in VariableManager get_vars() 18911 1727096284.94999: Calling all_inventory to load vars for managed_node1 18911 1727096284.95002: Calling groups_inventory to load vars for managed_node1 18911 1727096284.95005: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096284.95049: Calling all_plugins_play to load vars for managed_node1 18911 1727096284.95052: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096284.95056: Calling groups_plugins_play to load vars for managed_node1 18911 1727096284.95226: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096284.95434: done with get_vars() 18911 1727096284.95444: done getting variables 18911 1727096284.95521: in VariableManager get_vars() 18911 1727096284.95530: Calling all_inventory to load vars for managed_node1 18911 1727096284.95532: Calling groups_inventory to load vars for managed_node1 18911 1727096284.95535: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096284.95539: Calling all_plugins_play to load vars for managed_node1 18911 1727096284.95541: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096284.95543: Calling groups_plugins_play to load vars for managed_node1 18911 1727096284.95710: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096284.95910: done with get_vars() 18911 
1727096284.95922: done queuing things up, now waiting for results queue to drain 18911 1727096284.95924: results queue empty 18911 1727096284.95925: checking for any_errors_fatal 18911 1727096284.95928: done checking for any_errors_fatal 18911 1727096284.95929: checking for max_fail_percentage 18911 1727096284.95930: done checking for max_fail_percentage 18911 1727096284.95930: checking to see if all hosts have failed and the running result is not ok 18911 1727096284.95936: done checking to see if all hosts have failed 18911 1727096284.95936: getting the remaining hosts for this loop 18911 1727096284.95937: done getting the remaining hosts for this loop 18911 1727096284.95940: getting the next task for host managed_node1 18911 1727096284.95944: done getting next task for host managed_node1 18911 1727096284.95946: ^ task is: TASK: Show inside ethernet tests 18911 1727096284.95947: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096284.95949: getting variables 18911 1727096284.95950: in VariableManager get_vars() 18911 1727096284.95957: Calling all_inventory to load vars for managed_node1 18911 1727096284.95959: Calling groups_inventory to load vars for managed_node1 18911 1727096284.95961: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096284.95965: Calling all_plugins_play to load vars for managed_node1 18911 1727096284.95969: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096284.95972: Calling groups_plugins_play to load vars for managed_node1 18911 1727096284.96122: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096284.96322: done with get_vars() 18911 1727096284.96329: done getting variables 18911 1727096284.96406: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Show inside ethernet tests] ********************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:6 Monday 23 September 2024 08:58:04 -0400 (0:00:01.230) 0:00:04.078 ****** 18911 1727096284.96436: entering _queue_task() for managed_node1/debug 18911 1727096284.96438: Creating lock for debug 18911 1727096284.96951: worker is 1 (out of 1 available) 18911 1727096284.96963: exiting _queue_task() for managed_node1/debug 18911 1727096284.97675: done queuing things up, now waiting for results queue to drain 18911 1727096284.97677: waiting for pending results... 
18911 1727096284.98189: running TaskExecutor() for managed_node1/TASK: Show inside ethernet tests 18911 1727096284.98194: in run() - task 0afff68d-5257-09a7-aae1-00000000000b 18911 1727096284.98198: variable 'ansible_search_path' from source: unknown 18911 1727096284.98200: calling self._execute() 18911 1727096284.98332: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096284.98875: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096284.98879: variable 'omit' from source: magic vars 18911 1727096284.99537: variable 'ansible_distribution_major_version' from source: facts 18911 1727096284.99684: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096284.99696: variable 'omit' from source: magic vars 18911 1727096284.99731: variable 'omit' from source: magic vars 18911 1727096284.99814: variable 'omit' from source: magic vars 18911 1727096284.99920: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18911 1727096284.99963: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18911 1727096285.00174: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18911 1727096285.00178: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096285.00180: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096285.00192: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18911 1727096285.00204: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096285.00216: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096285.00498: Set connection var ansible_shell_executable to /bin/sh 18911 
1727096285.00502: Set connection var ansible_timeout to 10 18911 1727096285.00511: Set connection var ansible_shell_type to sh 18911 1727096285.00524: Set connection var ansible_module_compression to ZIP_DEFLATED 18911 1727096285.00539: Set connection var ansible_pipelining to False 18911 1727096285.00550: Set connection var ansible_connection to ssh 18911 1727096285.00579: variable 'ansible_shell_executable' from source: unknown 18911 1727096285.00616: variable 'ansible_connection' from source: unknown 18911 1727096285.00626: variable 'ansible_module_compression' from source: unknown 18911 1727096285.00651: variable 'ansible_shell_type' from source: unknown 18911 1727096285.00659: variable 'ansible_shell_executable' from source: unknown 18911 1727096285.00666: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096285.00755: variable 'ansible_pipelining' from source: unknown 18911 1727096285.00758: variable 'ansible_timeout' from source: unknown 18911 1727096285.00760: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096285.01083: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18911 1727096285.01087: variable 'omit' from source: magic vars 18911 1727096285.01090: starting attempt loop 18911 1727096285.01092: running the handler 18911 1727096285.01190: handler run complete 18911 1727096285.01219: attempt loop complete, returning result 18911 1727096285.01267: _execute() done 18911 1727096285.01279: dumping result to json 18911 1727096285.01287: done dumping result, returning 18911 1727096285.01309: done running TaskExecutor() for managed_node1/TASK: Show inside ethernet tests [0afff68d-5257-09a7-aae1-00000000000b] 18911 1727096285.01408: sending task 
result for task 0afff68d-5257-09a7-aae1-00000000000b ok: [managed_node1] => {} MSG: Inside ethernet tests 18911 1727096285.01789: no more pending results, returning what we have 18911 1727096285.01793: results queue empty 18911 1727096285.01794: checking for any_errors_fatal 18911 1727096285.01796: done checking for any_errors_fatal 18911 1727096285.01797: checking for max_fail_percentage 18911 1727096285.01798: done checking for max_fail_percentage 18911 1727096285.01799: checking to see if all hosts have failed and the running result is not ok 18911 1727096285.01800: done checking to see if all hosts have failed 18911 1727096285.01801: getting the remaining hosts for this loop 18911 1727096285.01802: done getting the remaining hosts for this loop 18911 1727096285.01806: getting the next task for host managed_node1 18911 1727096285.01813: done getting next task for host managed_node1 18911 1727096285.01816: ^ task is: TASK: Show network_provider 18911 1727096285.01819: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096285.01823: getting variables 18911 1727096285.01825: in VariableManager get_vars() 18911 1727096285.01857: Calling all_inventory to load vars for managed_node1 18911 1727096285.01860: Calling groups_inventory to load vars for managed_node1 18911 1727096285.01863: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096285.02073: Calling all_plugins_play to load vars for managed_node1 18911 1727096285.02077: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096285.02081: Calling groups_plugins_play to load vars for managed_node1 18911 1727096285.02284: done sending task result for task 0afff68d-5257-09a7-aae1-00000000000b 18911 1727096285.02288: WORKER PROCESS EXITING 18911 1727096285.02318: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096285.02519: done with get_vars() 18911 1727096285.02528: done getting variables 18911 1727096285.02586: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show network_provider] *************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:9 Monday 23 September 2024 08:58:05 -0400 (0:00:00.061) 0:00:04.140 ****** 18911 1727096285.02617: entering _queue_task() for managed_node1/debug 18911 1727096285.02969: worker is 1 (out of 1 available) 18911 1727096285.02981: exiting _queue_task() for managed_node1/debug 18911 1727096285.02990: done queuing things up, now waiting for results queue to drain 18911 1727096285.02991: waiting for pending results... 
18911 1727096285.03180: running TaskExecutor() for managed_node1/TASK: Show network_provider 18911 1727096285.03258: in run() - task 0afff68d-5257-09a7-aae1-00000000000c 18911 1727096285.03285: variable 'ansible_search_path' from source: unknown 18911 1727096285.03325: calling self._execute() 18911 1727096285.03409: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096285.03420: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096285.03433: variable 'omit' from source: magic vars 18911 1727096285.03820: variable 'ansible_distribution_major_version' from source: facts 18911 1727096285.03873: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096285.03877: variable 'omit' from source: magic vars 18911 1727096285.03903: variable 'omit' from source: magic vars 18911 1727096285.04057: variable 'omit' from source: magic vars 18911 1727096285.04061: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18911 1727096285.04073: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18911 1727096285.04099: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18911 1727096285.04121: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096285.04137: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096285.04372: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18911 1727096285.04375: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096285.04382: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096285.04574: Set connection var ansible_shell_executable to /bin/sh 18911 
1727096285.04578: Set connection var ansible_timeout to 10 18911 1727096285.04581: Set connection var ansible_shell_type to sh 18911 1727096285.04583: Set connection var ansible_module_compression to ZIP_DEFLATED 18911 1727096285.04585: Set connection var ansible_pipelining to False 18911 1727096285.04587: Set connection var ansible_connection to ssh 18911 1727096285.04589: variable 'ansible_shell_executable' from source: unknown 18911 1727096285.04591: variable 'ansible_connection' from source: unknown 18911 1727096285.04593: variable 'ansible_module_compression' from source: unknown 18911 1727096285.04601: variable 'ansible_shell_type' from source: unknown 18911 1727096285.04606: variable 'ansible_shell_executable' from source: unknown 18911 1727096285.04608: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096285.04610: variable 'ansible_pipelining' from source: unknown 18911 1727096285.04612: variable 'ansible_timeout' from source: unknown 18911 1727096285.04614: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096285.05065: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18911 1727096285.05070: variable 'omit' from source: magic vars 18911 1727096285.05073: starting attempt loop 18911 1727096285.05075: running the handler 18911 1727096285.05369: variable 'network_provider' from source: set_fact 18911 1727096285.05373: variable 'network_provider' from source: set_fact 18911 1727096285.05497: handler run complete 18911 1727096285.05519: attempt loop complete, returning result 18911 1727096285.05527: _execute() done 18911 1727096285.05534: dumping result to json 18911 1727096285.05541: done dumping result, returning 18911 1727096285.05555: done 
running TaskExecutor() for managed_node1/TASK: Show network_provider [0afff68d-5257-09a7-aae1-00000000000c] 18911 1727096285.05565: sending task result for task 0afff68d-5257-09a7-aae1-00000000000c ok: [managed_node1] => { "network_provider": "nm" } 18911 1727096285.05751: no more pending results, returning what we have 18911 1727096285.05755: results queue empty 18911 1727096285.05756: checking for any_errors_fatal 18911 1727096285.05764: done checking for any_errors_fatal 18911 1727096285.05765: checking for max_fail_percentage 18911 1727096285.05768: done checking for max_fail_percentage 18911 1727096285.05770: checking to see if all hosts have failed and the running result is not ok 18911 1727096285.05770: done checking to see if all hosts have failed 18911 1727096285.05771: getting the remaining hosts for this loop 18911 1727096285.05772: done getting the remaining hosts for this loop 18911 1727096285.05776: getting the next task for host managed_node1 18911 1727096285.05784: done getting next task for host managed_node1 18911 1727096285.05786: ^ task is: TASK: meta (flush_handlers) 18911 1727096285.05789: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096285.05792: getting variables 18911 1727096285.05808: in VariableManager get_vars() 18911 1727096285.05840: Calling all_inventory to load vars for managed_node1 18911 1727096285.05940: Calling groups_inventory to load vars for managed_node1 18911 1727096285.05945: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096285.05965: Calling all_plugins_play to load vars for managed_node1 18911 1727096285.05970: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096285.06001: done sending task result for task 0afff68d-5257-09a7-aae1-00000000000c 18911 1727096285.06004: WORKER PROCESS EXITING 18911 1727096285.06073: Calling groups_plugins_play to load vars for managed_node1 18911 1727096285.06461: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096285.06714: done with get_vars() 18911 1727096285.06729: done getting variables 18911 1727096285.06806: in VariableManager get_vars() 18911 1727096285.06817: Calling all_inventory to load vars for managed_node1 18911 1727096285.06819: Calling groups_inventory to load vars for managed_node1 18911 1727096285.06821: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096285.06826: Calling all_plugins_play to load vars for managed_node1 18911 1727096285.06829: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096285.06832: Calling groups_plugins_play to load vars for managed_node1 18911 1727096285.07006: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096285.07196: done with get_vars() 18911 1727096285.07209: done queuing things up, now waiting for results queue to drain 18911 1727096285.07211: results queue empty 18911 1727096285.07212: checking for any_errors_fatal 18911 1727096285.07215: done checking for any_errors_fatal 18911 1727096285.07215: checking for max_fail_percentage 18911 
1727096285.07217: done checking for max_fail_percentage 18911 1727096285.07217: checking to see if all hosts have failed and the running result is not ok 18911 1727096285.07218: done checking to see if all hosts have failed 18911 1727096285.07219: getting the remaining hosts for this loop 18911 1727096285.07220: done getting the remaining hosts for this loop 18911 1727096285.07222: getting the next task for host managed_node1 18911 1727096285.07237: done getting next task for host managed_node1 18911 1727096285.07239: ^ task is: TASK: meta (flush_handlers) 18911 1727096285.07240: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18911 1727096285.07243: getting variables 18911 1727096285.07244: in VariableManager get_vars() 18911 1727096285.07252: Calling all_inventory to load vars for managed_node1 18911 1727096285.07254: Calling groups_inventory to load vars for managed_node1 18911 1727096285.07257: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096285.07261: Calling all_plugins_play to load vars for managed_node1 18911 1727096285.07263: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096285.07266: Calling groups_plugins_play to load vars for managed_node1 18911 1727096285.07405: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096285.07596: done with get_vars() 18911 1727096285.07604: done getting variables 18911 1727096285.07647: in VariableManager get_vars() 18911 1727096285.07654: Calling all_inventory to load vars for managed_node1 18911 1727096285.07656: Calling groups_inventory to load vars for managed_node1 18911 1727096285.07658: Calling all_plugins_inventory to load vars for managed_node1 18911 
1727096285.07662: Calling all_plugins_play to load vars for managed_node1 18911 1727096285.07671: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096285.07674: Calling groups_plugins_play to load vars for managed_node1 18911 1727096285.07809: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096285.08017: done with get_vars() 18911 1727096285.08027: done queuing things up, now waiting for results queue to drain 18911 1727096285.08029: results queue empty 18911 1727096285.08030: checking for any_errors_fatal 18911 1727096285.08031: done checking for any_errors_fatal 18911 1727096285.08031: checking for max_fail_percentage 18911 1727096285.08035: done checking for max_fail_percentage 18911 1727096285.08050: checking to see if all hosts have failed and the running result is not ok 18911 1727096285.08051: done checking to see if all hosts have failed 18911 1727096285.08052: getting the remaining hosts for this loop 18911 1727096285.08053: done getting the remaining hosts for this loop 18911 1727096285.08056: getting the next task for host managed_node1 18911 1727096285.08058: done getting next task for host managed_node1 18911 1727096285.08059: ^ task is: None 18911 1727096285.08061: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096285.08062: done queuing things up, now waiting for results queue to drain 18911 1727096285.08062: results queue empty 18911 1727096285.08063: checking for any_errors_fatal 18911 1727096285.08064: done checking for any_errors_fatal 18911 1727096285.08064: checking for max_fail_percentage 18911 1727096285.08065: done checking for max_fail_percentage 18911 1727096285.08066: checking to see if all hosts have failed and the running result is not ok 18911 1727096285.08066: done checking to see if all hosts have failed 18911 1727096285.08069: getting the next task for host managed_node1 18911 1727096285.08072: done getting next task for host managed_node1 18911 1727096285.08072: ^ task is: None 18911 1727096285.08073: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096285.08114: in VariableManager get_vars() 18911 1727096285.08244: done with get_vars() 18911 1727096285.08250: in VariableManager get_vars() 18911 1727096285.08261: done with get_vars() 18911 1727096285.08265: variable 'omit' from source: magic vars 18911 1727096285.08296: in VariableManager get_vars() 18911 1727096285.08306: done with get_vars() 18911 1727096285.08441: variable 'omit' from source: magic vars PLAY [Test configuring ethernet devices] *************************************** 18911 1727096285.08886: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18911 1727096285.09080: getting the remaining hosts for this loop 18911 1727096285.09082: done getting the remaining hosts for this loop 18911 1727096285.09084: getting the next task for host managed_node1 18911 1727096285.09087: done getting next task for host managed_node1 18911 1727096285.09089: ^ task is: TASK: Gathering Facts 18911 1727096285.09091: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096285.09093: getting variables 18911 1727096285.09094: in VariableManager get_vars() 18911 1727096285.09101: Calling all_inventory to load vars for managed_node1 18911 1727096285.09103: Calling groups_inventory to load vars for managed_node1 18911 1727096285.09106: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096285.09110: Calling all_plugins_play to load vars for managed_node1 18911 1727096285.09113: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096285.09115: Calling groups_plugins_play to load vars for managed_node1 18911 1727096285.09300: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096285.09492: done with get_vars() 18911 1727096285.09499: done getting variables 18911 1727096285.09560: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:13 Monday 23 September 2024 08:58:05 -0400 (0:00:00.069) 0:00:04.210 ****** 18911 1727096285.09594: entering _queue_task() for managed_node1/gather_facts 18911 1727096285.09928: worker is 1 (out of 1 available) 18911 1727096285.09940: exiting _queue_task() for managed_node1/gather_facts 18911 1727096285.10065: done queuing things up, now waiting for results queue to drain 18911 1727096285.10068: waiting for pending results... 
18911 1727096285.10219: running TaskExecutor() for managed_node1/TASK: Gathering Facts 18911 1727096285.10356: in run() - task 0afff68d-5257-09a7-aae1-0000000000f0 18911 1727096285.10380: variable 'ansible_search_path' from source: unknown 18911 1727096285.10436: calling self._execute() 18911 1727096285.10596: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096285.10601: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096285.10604: variable 'omit' from source: magic vars 18911 1727096285.11014: variable 'ansible_distribution_major_version' from source: facts 18911 1727096285.11054: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096285.11076: variable 'omit' from source: magic vars 18911 1727096285.11142: variable 'omit' from source: magic vars 18911 1727096285.11166: variable 'omit' from source: magic vars 18911 1727096285.11214: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18911 1727096285.11262: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18911 1727096285.11359: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18911 1727096285.11365: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096285.11369: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096285.11372: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18911 1727096285.11374: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096285.11376: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096285.11502: Set connection var ansible_shell_executable to /bin/sh 18911 1727096285.11574: Set 
connection var ansible_timeout to 10 18911 1727096285.11584: Set connection var ansible_shell_type to sh 18911 1727096285.11587: Set connection var ansible_module_compression to ZIP_DEFLATED 18911 1727096285.11589: Set connection var ansible_pipelining to False 18911 1727096285.11591: Set connection var ansible_connection to ssh 18911 1727096285.11680: variable 'ansible_shell_executable' from source: unknown 18911 1727096285.11685: variable 'ansible_connection' from source: unknown 18911 1727096285.11687: variable 'ansible_module_compression' from source: unknown 18911 1727096285.11689: variable 'ansible_shell_type' from source: unknown 18911 1727096285.11692: variable 'ansible_shell_executable' from source: unknown 18911 1727096285.11694: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096285.11695: variable 'ansible_pipelining' from source: unknown 18911 1727096285.11697: variable 'ansible_timeout' from source: unknown 18911 1727096285.11698: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096285.12034: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18911 1727096285.12037: variable 'omit' from source: magic vars 18911 1727096285.12039: starting attempt loop 18911 1727096285.12041: running the handler 18911 1727096285.12043: variable 'ansible_facts' from source: unknown 18911 1727096285.12044: _low_level_execute_command(): starting 18911 1727096285.12046: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18911 1727096285.13262: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096285.13323: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096285.13486: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096285.13557: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096285.13692: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18911 1727096285.16104: stdout chunk (state=3): >>>/root <<< 18911 1727096285.16314: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096285.16318: stderr chunk (state=3): >>><<< 18911 1727096285.16394: stdout chunk (state=3): >>><<< 18911 1727096285.16398: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 18911 1727096285.16401: _low_level_execute_command(): starting 18911 1727096285.16490: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096285.1637104-19101-256476297997790 `" && echo ansible-tmp-1727096285.1637104-19101-256476297997790="` echo /root/.ansible/tmp/ansible-tmp-1727096285.1637104-19101-256476297997790 `" ) && sleep 0' 18911 1727096285.17759: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096285.17764: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096285.17770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096285.17816: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 18911 1727096285.17819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration <<< 
18911 1727096285.17829: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found <<< 18911 1727096285.17831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096285.17878: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096285.17882: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096285.17921: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096285.17997: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18911 1727096285.20894: stdout chunk (state=3): >>>ansible-tmp-1727096285.1637104-19101-256476297997790=/root/.ansible/tmp/ansible-tmp-1727096285.1637104-19101-256476297997790 <<< 18911 1727096285.21063: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096285.21066: stderr chunk (state=3): >>><<< 18911 1727096285.21071: stdout chunk (state=3): >>><<< 18911 1727096285.21089: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096285.1637104-19101-256476297997790=/root/.ansible/tmp/ansible-tmp-1727096285.1637104-19101-256476297997790 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 18911 1727096285.21217: variable 'ansible_module_compression' from source: unknown 18911 1727096285.21220: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18911d7od04qi/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 18911 1727096285.21249: variable 'ansible_facts' from source: unknown 18911 1727096285.21466: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096285.1637104-19101-256476297997790/AnsiballZ_setup.py 18911 1727096285.21823: Sending initial data 18911 1727096285.21826: Sent initial data (154 bytes) 18911 1727096285.22508: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096285.22574: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18911 1727096285.24676: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 18911 1727096285.24848: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18911 1727096285.24913: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18911 1727096285.25006: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18911d7od04qi/tmpthpthwb0 /root/.ansible/tmp/ansible-tmp-1727096285.1637104-19101-256476297997790/AnsiballZ_setup.py <<< 18911 1727096285.25020: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096285.1637104-19101-256476297997790/AnsiballZ_setup.py" <<< 18911 1727096285.25092: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18911d7od04qi/tmpthpthwb0" to remote "/root/.ansible/tmp/ansible-tmp-1727096285.1637104-19101-256476297997790/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096285.1637104-19101-256476297997790/AnsiballZ_setup.py" <<< 18911 1727096285.26802: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096285.26844: stderr chunk (state=3): >>><<< 18911 1727096285.26856: stdout chunk (state=3): >>><<< 18911 1727096285.26884: done transferring module to remote 18911 1727096285.26900: _low_level_execute_command(): starting 18911 1727096285.26910: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096285.1637104-19101-256476297997790/ /root/.ansible/tmp/ansible-tmp-1727096285.1637104-19101-256476297997790/AnsiballZ_setup.py && sleep 0' 18911 1727096285.27676: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096285.27692: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096285.27706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096285.27726: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096285.27773: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096285.27790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096285.27881: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096285.28004: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096285.28213: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096285.30027: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096285.30446: stderr chunk (state=3): >>><<< 18911 1727096285.30450: stdout chunk (state=3): >>><<< 18911 1727096285.30452: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096285.30454: _low_level_execute_command(): starting 18911 1727096285.30457: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096285.1637104-19101-256476297997790/AnsiballZ_setup.py && sleep 0' 18911 1727096285.31413: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096285.31484: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096285.31547: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096285.31622: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096285.31800: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18911 1727096286.08004: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "58", "second": "05", "epoch": "1727096285", "epoch_int": "1727096285", "date": "2024-09-23", "time": "08:58:05", "iso8601_micro": "2024-09-23T12:58:05.691760Z", "iso8601": "2024-09-23T12:58:05Z", "iso8601_basic": "20240923T085805691760", "iso8601_basic_short": "20240923T085805", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_loadavg": {"1m": 0.4013671875, "5m": 0.33447265625, "15m": 0.16796875}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_apparmor": {"status": "disabled"}, "ansible_pkg_mgr": "dnf", "ansible_is_chroot": false, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCSodd+GDkKMH+cjKateYxMnFqGIiE1cnsha6Cte8GTIWdeerR6yaWx/72lnuD/E1u9q+QiEGNjwKJ+EVlEbMpe1E41fS12cQMWtucXlJvMJH1Ykit9m7FF02YjLW9EkA9SSgc18cxhBxGHI+sOyXoiqbbp/q8CeDDI6WWQ+a3ZlRONcuaxwv9JBCJY6ZIqEDUsG55fAkwTft0LrM9VfN5D7iCcrlRTAVcjACJKgdvzHt52sVh5wuhCgzisIGrxldgqYAo0PnKgLwHzMWTYpwBJQIwMf6hC7O+PTnljDRXhsY8vTDdVI9W4Nr7mLWJ7W5JZ0lrKajT+FZ1b8usigp6firlnTtqzVmKJ4HJTlBOQ00HmCvVRSFb8Y15UB8Gz/xjJSXiQklJoKt+/wRRazuZALEowaIIdTzQUGv5Yx/BYhcAv8KVvhqbfkvyuwAgI+jzck3nhlUJ3B1WVYs5RdDY1WWEbqEtKyERi7P+cyWg6OhEMmJIpLs1QBW+CU/89w+s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE9Vvl7mU+dsvtGHlOyMt0Nx2BOUkSxor6Ul/pQgHJkP0t5ie5s9TW9uniTzz5NfayEXaMlnOkK/UBl0o7msG9k=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOsOWQBYI+BXY81/x/jv3URNQxi5p8vLDeQfEZVgnjS4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 40710 10.31.11.125 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", 
"SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 40710 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-125", "ansible_nodename": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2082d1236a0674562ec5e13633e7ec", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_lsb": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", 
"tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": 
"vif-0", "promisc": false, "ipv4": {"address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10ff:acff:fe3f:90f5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", 
"hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.125"], "ansible_all_ipv6_addresses": ["fe80::10ff:acff:fe3f:90f5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.125", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10ff:acff:fe3f:90f5"]}, "ansible_local": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fips": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2932, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 599, "free": 2932}, "nocache": {"free": 3269, 
"used": 262}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_uuid": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 439, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 
261795467264, "block_size": 4096, "block_total": 65519099, "block_available": 63914909, "block_used": 1604190, "inode_total": 131070960, "inode_available": 131029100, "inode_used": 41860, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_fibre_channel_wwn": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_iscsi_iqn": "", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 18911 1727096286.10250: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. <<< 18911 1727096286.10253: stdout chunk (state=3): >>><<< 18911 1727096286.10255: stderr chunk (state=3): >>><<< 18911 1727096286.10294: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "58", "second": "05", "epoch": "1727096285", "epoch_int": "1727096285", "date": "2024-09-23", "time": "08:58:05", "iso8601_micro": "2024-09-23T12:58:05.691760Z", "iso8601": "2024-09-23T12:58:05Z", "iso8601_basic": "20240923T085805691760", "iso8601_basic_short": "20240923T085805", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_loadavg": {"1m": 0.4013671875, "5m": 0.33447265625, "15m": 0.16796875}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": 
"10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_apparmor": {"status": "disabled"}, "ansible_pkg_mgr": "dnf", "ansible_is_chroot": false, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCSodd+GDkKMH+cjKateYxMnFqGIiE1cnsha6Cte8GTIWdeerR6yaWx/72lnuD/E1u9q+QiEGNjwKJ+EVlEbMpe1E41fS12cQMWtucXlJvMJH1Ykit9m7FF02YjLW9EkA9SSgc18cxhBxGHI+sOyXoiqbbp/q8CeDDI6WWQ+a3ZlRONcuaxwv9JBCJY6ZIqEDUsG55fAkwTft0LrM9VfN5D7iCcrlRTAVcjACJKgdvzHt52sVh5wuhCgzisIGrxldgqYAo0PnKgLwHzMWTYpwBJQIwMf6hC7O+PTnljDRXhsY8vTDdVI9W4Nr7mLWJ7W5JZ0lrKajT+FZ1b8usigp6firlnTtqzVmKJ4HJTlBOQ00HmCvVRSFb8Y15UB8Gz/xjJSXiQklJoKt+/wRRazuZALEowaIIdTzQUGv5Yx/BYhcAv8KVvhqbfkvyuwAgI+jzck3nhlUJ3B1WVYs5RdDY1WWEbqEtKyERi7P+cyWg6OhEMmJIpLs1QBW+CU/89w+s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE9Vvl7mU+dsvtGHlOyMt0Nx2BOUkSxor6Ul/pQgHJkP0t5ie5s9TW9uniTzz5NfayEXaMlnOkK/UBl0o7msG9k=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOsOWQBYI+BXY81/x/jv3URNQxi5p8vLDeQfEZVgnjS4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, 
"crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 40710 10.31.11.125 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 40710 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-125", "ansible_nodename": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2082d1236a0674562ec5e13633e7ec", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_lsb": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": 
[{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off 
[fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10ff:acff:fe3f:90f5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", 
"tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.125"], "ansible_all_ipv6_addresses": ["fe80::10ff:acff:fe3f:90f5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.125", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10ff:acff:fe3f:90f5"]}, "ansible_local": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fips": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], 
"ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2932, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 599, "free": 2932}, "nocache": {"free": 3269, "used": 262}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_uuid": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", 
"sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 439, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261795467264, "block_size": 4096, "block_total": 65519099, "block_available": 63914909, "block_used": 1604190, "inode_total": 131070960, "inode_available": 131029100, "inode_used": 41860, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_fibre_channel_wwn": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_iscsi_iqn": "", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 18911 1727096286.10621: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096285.1637104-19101-256476297997790/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18911 1727096286.10655: _low_level_execute_command(): starting 18911 1727096286.10665: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096285.1637104-19101-256476297997790/ > /dev/null 2>&1 && sleep 0' 18911 1727096286.11377: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096286.11395: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096286.11429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096286.11446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096286.11480: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 
originally 10.31.11.125 <<< 18911 1727096286.11660: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096286.11723: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096286.11753: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096286.11894: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096286.13784: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096286.13793: stderr chunk (state=3): >>><<< 18911 1727096286.13797: stdout chunk (state=3): >>><<< 18911 1727096286.13811: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096286.13819: handler run complete 18911 1727096286.13899: variable 'ansible_facts' from source: unknown 18911 1727096286.13962: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096286.14146: variable 'ansible_facts' from source: unknown 18911 1727096286.14203: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096286.14292: attempt loop complete, returning result 18911 1727096286.14297: _execute() done 18911 1727096286.14299: dumping result to json 18911 1727096286.14319: done dumping result, returning 18911 1727096286.14326: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0afff68d-5257-09a7-aae1-0000000000f0] 18911 1727096286.14331: sending task result for task 0afff68d-5257-09a7-aae1-0000000000f0 18911 1727096286.14614: done sending task result for task 0afff68d-5257-09a7-aae1-0000000000f0 18911 1727096286.14617: WORKER PROCESS EXITING ok: [managed_node1] 18911 1727096286.14836: no more pending results, returning what we have 18911 1727096286.14839: results queue empty 18911 1727096286.14840: checking for any_errors_fatal 18911 1727096286.14841: done checking for any_errors_fatal 18911 1727096286.14842: checking for max_fail_percentage 18911 1727096286.14844: done checking for max_fail_percentage 18911 1727096286.14844: checking to see if all hosts have failed and the 
running result is not ok 18911 1727096286.14845: done checking to see if all hosts have failed 18911 1727096286.14846: getting the remaining hosts for this loop 18911 1727096286.14847: done getting the remaining hosts for this loop 18911 1727096286.14850: getting the next task for host managed_node1 18911 1727096286.14855: done getting next task for host managed_node1 18911 1727096286.14856: ^ task is: TASK: meta (flush_handlers) 18911 1727096286.14862: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18911 1727096286.14866: getting variables 18911 1727096286.14880: in VariableManager get_vars() 18911 1727096286.14911: Calling all_inventory to load vars for managed_node1 18911 1727096286.14914: Calling groups_inventory to load vars for managed_node1 18911 1727096286.14921: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096286.14932: Calling all_plugins_play to load vars for managed_node1 18911 1727096286.14935: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096286.14938: Calling groups_plugins_play to load vars for managed_node1 18911 1727096286.15141: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096286.15513: done with get_vars() 18911 1727096286.15528: done getting variables 18911 1727096286.15605: in VariableManager get_vars() 18911 1727096286.15613: Calling all_inventory to load vars for managed_node1 18911 1727096286.15616: Calling groups_inventory to load vars for managed_node1 18911 1727096286.15618: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096286.15623: Calling all_plugins_play to load vars for managed_node1 18911 1727096286.15625: Calling 
groups_plugins_inventory to load vars for managed_node1 18911 1727096286.15628: Calling groups_plugins_play to load vars for managed_node1 18911 1727096286.15797: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096286.16000: done with get_vars() 18911 1727096286.16012: done queuing things up, now waiting for results queue to drain 18911 1727096286.16013: results queue empty 18911 1727096286.16014: checking for any_errors_fatal 18911 1727096286.16017: done checking for any_errors_fatal 18911 1727096286.16021: checking for max_fail_percentage 18911 1727096286.16022: done checking for max_fail_percentage 18911 1727096286.16023: checking to see if all hosts have failed and the running result is not ok 18911 1727096286.16023: done checking to see if all hosts have failed 18911 1727096286.16024: getting the remaining hosts for this loop 18911 1727096286.16025: done getting the remaining hosts for this loop 18911 1727096286.16027: getting the next task for host managed_node1 18911 1727096286.16031: done getting next task for host managed_node1 18911 1727096286.16033: ^ task is: TASK: Set type={{ type }} and interface={{ interface }} 18911 1727096286.16034: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096286.16036: getting variables 18911 1727096286.16037: in VariableManager get_vars() 18911 1727096286.16044: Calling all_inventory to load vars for managed_node1 18911 1727096286.16046: Calling groups_inventory to load vars for managed_node1 18911 1727096286.16048: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096286.16052: Calling all_plugins_play to load vars for managed_node1 18911 1727096286.16055: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096286.16058: Calling groups_plugins_play to load vars for managed_node1 18911 1727096286.16196: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096286.16408: done with get_vars() 18911 1727096286.16416: done getting variables 18911 1727096286.16458: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 18911 1727096286.16676: variable 'type' from source: play vars 18911 1727096286.16682: variable 'interface' from source: play vars TASK [Set type=veth and interface=lsr27] *************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:20 Monday 23 September 2024 08:58:06 -0400 (0:00:01.071) 0:00:05.281 ****** 18911 1727096286.16734: entering _queue_task() for managed_node1/set_fact 18911 1727096286.17198: worker is 1 (out of 1 available) 18911 1727096286.17208: exiting _queue_task() for managed_node1/set_fact 18911 1727096286.17220: done queuing things up, now waiting for results queue to drain 18911 1727096286.17222: waiting for pending results... 
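The `TASK [Set type=veth and interface=lsr27]` banner above comes from a templated `set_fact` task at `tests_ethernet.yml:20`. The playbook source itself is not part of this log; a minimal sketch of what that task plausibly looks like, reconstructed only from the variables the log resolves (`type` and `interface` from play vars) and the resulting `ansible_facts`:

```yaml
# Hypothetical reconstruction of tests_ethernet.yml:20 -- the actual file may
# differ. The log shows 'type' and 'interface' resolved from play vars and
# then set as host facts (type=veth, interface=lsr27).
- name: Set type={{ type }} and interface={{ interface }}
  set_fact:
    type: "{{ type }}"            # resolves to "veth" per the task result below
    interface: "{{ interface }}"  # resolves to "lsr27" per the task result below
```

Note that `set_fact` runs entirely on the controller (the handler completes with no `_low_level_execute_command` calls in the log), which is why this task takes only ~0.027s.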
18911 1727096286.17459: running TaskExecutor() for managed_node1/TASK: Set type=veth and interface=lsr27 18911 1727096286.17499: in run() - task 0afff68d-5257-09a7-aae1-00000000000f 18911 1727096286.17511: variable 'ansible_search_path' from source: unknown 18911 1727096286.17538: calling self._execute() 18911 1727096286.17603: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096286.17607: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096286.17616: variable 'omit' from source: magic vars 18911 1727096286.17887: variable 'ansible_distribution_major_version' from source: facts 18911 1727096286.17900: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096286.17903: variable 'omit' from source: magic vars 18911 1727096286.17932: variable 'omit' from source: magic vars 18911 1727096286.17953: variable 'type' from source: play vars 18911 1727096286.18010: variable 'type' from source: play vars 18911 1727096286.18013: variable 'interface' from source: play vars 18911 1727096286.18065: variable 'interface' from source: play vars 18911 1727096286.18078: variable 'omit' from source: magic vars 18911 1727096286.18113: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18911 1727096286.18146: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18911 1727096286.18165: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18911 1727096286.18179: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096286.18188: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096286.18211: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18911 
1727096286.18214: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096286.18217: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096286.18293: Set connection var ansible_shell_executable to /bin/sh 18911 1727096286.18297: Set connection var ansible_timeout to 10 18911 1727096286.18299: Set connection var ansible_shell_type to sh 18911 1727096286.18306: Set connection var ansible_module_compression to ZIP_DEFLATED 18911 1727096286.18311: Set connection var ansible_pipelining to False 18911 1727096286.18316: Set connection var ansible_connection to ssh 18911 1727096286.18334: variable 'ansible_shell_executable' from source: unknown 18911 1727096286.18337: variable 'ansible_connection' from source: unknown 18911 1727096286.18344: variable 'ansible_module_compression' from source: unknown 18911 1727096286.18347: variable 'ansible_shell_type' from source: unknown 18911 1727096286.18349: variable 'ansible_shell_executable' from source: unknown 18911 1727096286.18572: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096286.18575: variable 'ansible_pipelining' from source: unknown 18911 1727096286.18577: variable 'ansible_timeout' from source: unknown 18911 1727096286.18579: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096286.18582: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18911 1727096286.18584: variable 'omit' from source: magic vars 18911 1727096286.18586: starting attempt loop 18911 1727096286.18587: running the handler 18911 1727096286.18593: handler run complete 18911 1727096286.18595: attempt loop complete, returning result 18911 1727096286.18597: _execute() done 18911 
1727096286.18599: dumping result to json 18911 1727096286.18601: done dumping result, returning 18911 1727096286.18603: done running TaskExecutor() for managed_node1/TASK: Set type=veth and interface=lsr27 [0afff68d-5257-09a7-aae1-00000000000f] 18911 1727096286.18605: sending task result for task 0afff68d-5257-09a7-aae1-00000000000f 18911 1727096286.18665: done sending task result for task 0afff68d-5257-09a7-aae1-00000000000f 18911 1727096286.18669: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "interface": "lsr27", "type": "veth" }, "changed": false } 18911 1727096286.18748: no more pending results, returning what we have 18911 1727096286.18751: results queue empty 18911 1727096286.18753: checking for any_errors_fatal 18911 1727096286.18755: done checking for any_errors_fatal 18911 1727096286.18756: checking for max_fail_percentage 18911 1727096286.18757: done checking for max_fail_percentage 18911 1727096286.18759: checking to see if all hosts have failed and the running result is not ok 18911 1727096286.18759: done checking to see if all hosts have failed 18911 1727096286.18760: getting the remaining hosts for this loop 18911 1727096286.18761: done getting the remaining hosts for this loop 18911 1727096286.18764: getting the next task for host managed_node1 18911 1727096286.18771: done getting next task for host managed_node1 18911 1727096286.18774: ^ task is: TASK: Include the task 'show_interfaces.yml' 18911 1727096286.18775: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096286.18779: getting variables 18911 1727096286.18781: in VariableManager get_vars() 18911 1727096286.18805: Calling all_inventory to load vars for managed_node1 18911 1727096286.18808: Calling groups_inventory to load vars for managed_node1 18911 1727096286.18810: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096286.18820: Calling all_plugins_play to load vars for managed_node1 18911 1727096286.18822: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096286.18825: Calling groups_plugins_play to load vars for managed_node1 18911 1727096286.19004: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096286.19300: done with get_vars() 18911 1727096286.19309: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:24 Monday 23 September 2024 08:58:06 -0400 (0:00:00.027) 0:00:05.309 ****** 18911 1727096286.19528: entering _queue_task() for managed_node1/include_tasks 18911 1727096286.20055: worker is 1 (out of 1 available) 18911 1727096286.20371: exiting _queue_task() for managed_node1/include_tasks 18911 1727096286.20385: done queuing things up, now waiting for results queue to drain 18911 1727096286.20386: waiting for pending results... 
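The `TASK [Include the task 'show_interfaces.yml']` banner above corresponds to `tests_ethernet.yml:24`, which the log processes as a dynamic include (`include_tasks`, loaded at runtime rather than at parse time). A sketch under that assumption, not the actual file contents:

```yaml
# Hypothetical sketch of tests_ethernet.yml:24; the relative path is inferred
# from the resolved include target shown in the log
# (.../tests/network/playbooks/tasks/show_interfaces.yml).
- name: Include the task 'show_interfaces.yml'
  include_tasks: tasks/show_interfaces.yml
```

Because this is a dynamic include, the log's subsequent "generating all_blocks data" / "extending task lists for all hosts with included blocks" messages reflect the included file being parsed and spliced into the host's task list only now, mid-run.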
18911 1727096286.20674: running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' 18911 1727096286.20720: in run() - task 0afff68d-5257-09a7-aae1-000000000010 18911 1727096286.20739: variable 'ansible_search_path' from source: unknown 18911 1727096286.20804: calling self._execute() 18911 1727096286.20945: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096286.21173: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096286.21176: variable 'omit' from source: magic vars 18911 1727096286.21659: variable 'ansible_distribution_major_version' from source: facts 18911 1727096286.21678: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096286.21687: _execute() done 18911 1727096286.21694: dumping result to json 18911 1727096286.21700: done dumping result, returning 18911 1727096286.21708: done running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' [0afff68d-5257-09a7-aae1-000000000010] 18911 1727096286.21716: sending task result for task 0afff68d-5257-09a7-aae1-000000000010 18911 1727096286.21840: no more pending results, returning what we have 18911 1727096286.21845: in VariableManager get_vars() 18911 1727096286.21883: Calling all_inventory to load vars for managed_node1 18911 1727096286.21886: Calling groups_inventory to load vars for managed_node1 18911 1727096286.21889: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096286.21903: Calling all_plugins_play to load vars for managed_node1 18911 1727096286.21905: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096286.21907: Calling groups_plugins_play to load vars for managed_node1 18911 1727096286.22144: done sending task result for task 0afff68d-5257-09a7-aae1-000000000010 18911 1727096286.22148: WORKER PROCESS EXITING 18911 1727096286.22157: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096286.22274: done with get_vars() 18911 1727096286.22279: variable 'ansible_search_path' from source: unknown 18911 1727096286.22289: we have included files to process 18911 1727096286.22289: generating all_blocks data 18911 1727096286.22290: done generating all_blocks data 18911 1727096286.22291: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 18911 1727096286.22295: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 18911 1727096286.22297: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 18911 1727096286.22405: in VariableManager get_vars() 18911 1727096286.22415: done with get_vars() 18911 1727096286.22490: done processing included file 18911 1727096286.22491: iterating over new_blocks loaded from include file 18911 1727096286.22492: in VariableManager get_vars() 18911 1727096286.22499: done with get_vars() 18911 1727096286.22500: filtering new block on tags 18911 1727096286.22514: done filtering new block on tags 18911 1727096286.22515: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node1 18911 1727096286.22519: extending task lists for all hosts with included blocks 18911 1727096286.22569: done extending task lists 18911 1727096286.22570: done processing included files 18911 1727096286.22571: results queue empty 18911 1727096286.22571: checking for any_errors_fatal 18911 1727096286.22573: done checking for any_errors_fatal 18911 1727096286.22574: checking for max_fail_percentage 18911 1727096286.22574: done checking for 
max_fail_percentage 18911 1727096286.22575: checking to see if all hosts have failed and the running result is not ok 18911 1727096286.22575: done checking to see if all hosts have failed 18911 1727096286.22576: getting the remaining hosts for this loop 18911 1727096286.22577: done getting the remaining hosts for this loop 18911 1727096286.22578: getting the next task for host managed_node1 18911 1727096286.22581: done getting next task for host managed_node1 18911 1727096286.22582: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 18911 1727096286.22584: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096286.22585: getting variables 18911 1727096286.22586: in VariableManager get_vars() 18911 1727096286.22591: Calling all_inventory to load vars for managed_node1 18911 1727096286.22592: Calling groups_inventory to load vars for managed_node1 18911 1727096286.22594: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096286.22597: Calling all_plugins_play to load vars for managed_node1 18911 1727096286.22599: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096286.22600: Calling groups_plugins_play to load vars for managed_node1 18911 1727096286.22689: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096286.22959: done with get_vars() 18911 1727096286.22970: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Monday 23 September 2024 08:58:06 -0400 (0:00:00.034) 0:00:05.344 ****** 18911 1727096286.23019: entering _queue_task() for managed_node1/include_tasks 18911 1727096286.23293: worker is 1 (out of 1 available) 18911 1727096286.23307: exiting _queue_task() for managed_node1/include_tasks 18911 1727096286.23320: done queuing things up, now waiting for results queue to drain 18911 1727096286.23322: waiting for pending results... 
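The banner above shows that `show_interfaces.yml` itself begins (at line 3) by chaining into a second include. A minimal sketch of that first task, reconstructed from the banner and the include target the log resolves next; the real file may carry additional tasks after it:

```yaml
# Hypothetical sketch of show_interfaces.yml:3; the include target is inferred
# from the log's "processing included file: .../get_current_interfaces.yml".
- name: Include the task 'get_current_interfaces.yml'
  include_tasks: get_current_interfaces.yml
```

Each nesting level of these includes adds another layer to the "tasks child state?" HOST STATE shown in the log, which is why the state dumps grow progressively deeper as the run descends into `get_current_interfaces.yml`.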
18911 1727096286.23590: running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' 18911 1727096286.23672: in run() - task 0afff68d-5257-09a7-aae1-000000000104 18911 1727096286.23705: variable 'ansible_search_path' from source: unknown 18911 1727096286.23811: variable 'ansible_search_path' from source: unknown 18911 1727096286.23815: calling self._execute() 18911 1727096286.23858: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096286.23873: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096286.23891: variable 'omit' from source: magic vars 18911 1727096286.24339: variable 'ansible_distribution_major_version' from source: facts 18911 1727096286.24377: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096286.24389: _execute() done 18911 1727096286.24397: dumping result to json 18911 1727096286.24403: done dumping result, returning 18911 1727096286.24465: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' [0afff68d-5257-09a7-aae1-000000000104] 18911 1727096286.24474: sending task result for task 0afff68d-5257-09a7-aae1-000000000104 18911 1727096286.24543: done sending task result for task 0afff68d-5257-09a7-aae1-000000000104 18911 1727096286.24547: WORKER PROCESS EXITING 18911 1727096286.24581: no more pending results, returning what we have 18911 1727096286.24586: in VariableManager get_vars() 18911 1727096286.24622: Calling all_inventory to load vars for managed_node1 18911 1727096286.24625: Calling groups_inventory to load vars for managed_node1 18911 1727096286.24629: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096286.24644: Calling all_plugins_play to load vars for managed_node1 18911 1727096286.24647: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096286.24650: Calling groups_plugins_play to load vars for managed_node1 18911 
1727096286.25112: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096286.25255: done with get_vars() 18911 1727096286.25263: variable 'ansible_search_path' from source: unknown 18911 1727096286.25264: variable 'ansible_search_path' from source: unknown 18911 1727096286.25293: we have included files to process 18911 1727096286.25294: generating all_blocks data 18911 1727096286.25295: done generating all_blocks data 18911 1727096286.25296: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 18911 1727096286.25297: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 18911 1727096286.25298: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 18911 1727096286.25527: done processing included file 18911 1727096286.25529: iterating over new_blocks loaded from include file 18911 1727096286.25530: in VariableManager get_vars() 18911 1727096286.25539: done with get_vars() 18911 1727096286.25540: filtering new block on tags 18911 1727096286.25550: done filtering new block on tags 18911 1727096286.25552: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node1 18911 1727096286.25555: extending task lists for all hosts with included blocks 18911 1727096286.25623: done extending task lists 18911 1727096286.25624: done processing included files 18911 1727096286.25625: results queue empty 18911 1727096286.25625: checking for any_errors_fatal 18911 1727096286.25627: done checking for any_errors_fatal 18911 1727096286.25628: checking for max_fail_percentage 18911 1727096286.25628: done 
checking for max_fail_percentage 18911 1727096286.25629: checking to see if all hosts have failed and the running result is not ok 18911 1727096286.25629: done checking to see if all hosts have failed 18911 1727096286.25630: getting the remaining hosts for this loop 18911 1727096286.25631: done getting the remaining hosts for this loop 18911 1727096286.25632: getting the next task for host managed_node1 18911 1727096286.25635: done getting next task for host managed_node1 18911 1727096286.25636: ^ task is: TASK: Gather current interface info 18911 1727096286.25638: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096286.25639: getting variables 18911 1727096286.25640: in VariableManager get_vars() 18911 1727096286.25646: Calling all_inventory to load vars for managed_node1 18911 1727096286.25648: Calling groups_inventory to load vars for managed_node1 18911 1727096286.25649: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096286.25653: Calling all_plugins_play to load vars for managed_node1 18911 1727096286.25654: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096286.25656: Calling groups_plugins_play to load vars for managed_node1 18911 1727096286.25770: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096286.25878: done with get_vars() 18911 1727096286.25885: done getting variables 18911 1727096286.25914: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Monday 23 September 2024 08:58:06 -0400 (0:00:00.029) 0:00:05.373 ****** 18911 1727096286.25935: entering _queue_task() for managed_node1/command 18911 1727096286.26160: worker is 1 (out of 1 available) 18911 1727096286.26173: exiting _queue_task() for managed_node1/command 18911 1727096286.26186: done queuing things up, now waiting for results queue to drain 18911 1727096286.26188: waiting for pending results... 
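The `TASK [Gather current interface info]` banner above is a `command` task (the log loads the `command` action plugin for it at `get_current_interfaces.yml:3`). The command line itself is not visible in this chunk; only the connection setup (`echo ~`, remote temp-dir creation) appears before the chunk ends. A hedged sketch of what such a task commonly looks like, with both the command and the register name being assumptions:

```yaml
# Hypothetical sketch; the actual command executed by get_current_interfaces.yml:3
# is not shown in this log chunk. Listing /sys/class/net is one common way to
# enumerate the host's current network interfaces. The register name is invented
# here for illustration.
- name: Gather current interface info
  command: ls -1 /sys/class/net
  register: current_interfaces_info
```

Unlike the controller-side `set_fact` task earlier, this one triggers the full remote-execution path visible below: `_low_level_execute_command()` first probes the remote home directory (`echo ~ && sleep 0`), then creates a per-task temp directory (`/root/.ansible/tmp/ansible-tmp-<timestamp>-<pid>-<random>`) into which the module payload will be uploaded.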
18911 1727096286.26441: running TaskExecutor() for managed_node1/TASK: Gather current interface info 18911 1727096286.26446: in run() - task 0afff68d-5257-09a7-aae1-000000000115 18911 1727096286.26450: variable 'ansible_search_path' from source: unknown 18911 1727096286.26453: variable 'ansible_search_path' from source: unknown 18911 1727096286.26456: calling self._execute() 18911 1727096286.26514: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096286.26517: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096286.26533: variable 'omit' from source: magic vars 18911 1727096286.26803: variable 'ansible_distribution_major_version' from source: facts 18911 1727096286.26812: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096286.26818: variable 'omit' from source: magic vars 18911 1727096286.26849: variable 'omit' from source: magic vars 18911 1727096286.26881: variable 'omit' from source: magic vars 18911 1727096286.27076: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18911 1727096286.27080: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18911 1727096286.27083: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18911 1727096286.27085: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096286.27087: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096286.27091: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18911 1727096286.27094: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096286.27096: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 
1727096286.27183: Set connection var ansible_shell_executable to /bin/sh 18911 1727096286.27195: Set connection var ansible_timeout to 10 18911 1727096286.27203: Set connection var ansible_shell_type to sh 18911 1727096286.27216: Set connection var ansible_module_compression to ZIP_DEFLATED 18911 1727096286.27226: Set connection var ansible_pipelining to False 18911 1727096286.27236: Set connection var ansible_connection to ssh 18911 1727096286.27265: variable 'ansible_shell_executable' from source: unknown 18911 1727096286.27277: variable 'ansible_connection' from source: unknown 18911 1727096286.27285: variable 'ansible_module_compression' from source: unknown 18911 1727096286.27292: variable 'ansible_shell_type' from source: unknown 18911 1727096286.27299: variable 'ansible_shell_executable' from source: unknown 18911 1727096286.27307: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096286.27315: variable 'ansible_pipelining' from source: unknown 18911 1727096286.27321: variable 'ansible_timeout' from source: unknown 18911 1727096286.27327: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096286.27468: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18911 1727096286.27484: variable 'omit' from source: magic vars 18911 1727096286.27493: starting attempt loop 18911 1727096286.27500: running the handler 18911 1727096286.27519: _low_level_execute_command(): starting 18911 1727096286.27533: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18911 1727096286.28151: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096286.28165: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 18911 1727096286.28185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096286.28196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096286.28207: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096286.28220: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096286.28229: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096286.28284: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096286.28298: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096286.28384: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096286.30094: stdout chunk (state=3): >>>/root <<< 18911 1727096286.30184: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096286.30218: stderr chunk (state=3): >>><<< 18911 1727096286.30223: stdout chunk (state=3): >>><<< 18911 1727096286.30247: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096286.30258: _low_level_execute_command(): starting 18911 1727096286.30269: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096286.3024685-19173-211568984753367 `" && echo ansible-tmp-1727096286.3024685-19173-211568984753367="` echo /root/.ansible/tmp/ansible-tmp-1727096286.3024685-19173-211568984753367 `" ) && sleep 0' 18911 1727096286.30734: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096286.30738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 18911 1727096286.30747: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 18911 1727096286.30750: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found <<< 18911 1727096286.30752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096286.30802: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096286.30806: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096286.30808: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096286.30883: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096286.32838: stdout chunk (state=3): >>>ansible-tmp-1727096286.3024685-19173-211568984753367=/root/.ansible/tmp/ansible-tmp-1727096286.3024685-19173-211568984753367 <<< 18911 1727096286.32943: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096286.32974: stderr chunk (state=3): >>><<< 18911 1727096286.32978: stdout chunk (state=3): >>><<< 18911 1727096286.32994: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096286.3024685-19173-211568984753367=/root/.ansible/tmp/ansible-tmp-1727096286.3024685-19173-211568984753367 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 
originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096286.33026: variable 'ansible_module_compression' from source: unknown 18911 1727096286.33111: ANSIBALLZ: Using generic lock for ansible.legacy.command 18911 1727096286.33114: ANSIBALLZ: Acquiring lock 18911 1727096286.33117: ANSIBALLZ: Lock acquired: 140481135532592 18911 1727096286.33119: ANSIBALLZ: Creating module 18911 1727096286.41880: ANSIBALLZ: Writing module into payload 18911 1727096286.41942: ANSIBALLZ: Writing module 18911 1727096286.41959: ANSIBALLZ: Renaming module 18911 1727096286.41965: ANSIBALLZ: Done creating module 18911 1727096286.41981: variable 'ansible_facts' from source: unknown 18911 1727096286.42024: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096286.3024685-19173-211568984753367/AnsiballZ_command.py 18911 1727096286.42124: Sending initial data 18911 1727096286.42127: Sent initial data (156 bytes) 18911 1727096286.42588: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096286.42592: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 18911 1727096286.42594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096286.42596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096286.42598: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096286.42649: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096286.42653: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096286.42655: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096286.42729: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096286.44416: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" 
revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18911 1727096286.44481: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18911 1727096286.44552: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18911d7od04qi/tmppjmk3gp6 /root/.ansible/tmp/ansible-tmp-1727096286.3024685-19173-211568984753367/AnsiballZ_command.py <<< 18911 1727096286.44555: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096286.3024685-19173-211568984753367/AnsiballZ_command.py" <<< 18911 1727096286.44622: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18911d7od04qi/tmppjmk3gp6" to remote "/root/.ansible/tmp/ansible-tmp-1727096286.3024685-19173-211568984753367/AnsiballZ_command.py" <<< 18911 1727096286.44625: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096286.3024685-19173-211568984753367/AnsiballZ_command.py" <<< 18911 1727096286.45543: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096286.45546: stderr chunk (state=3): >>><<< 18911 1727096286.45549: stdout chunk (state=3): >>><<< 18911 1727096286.45596: done transferring module to remote 18911 1727096286.45599: _low_level_execute_command(): starting 18911 1727096286.45601: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096286.3024685-19173-211568984753367/ /root/.ansible/tmp/ansible-tmp-1727096286.3024685-19173-211568984753367/AnsiballZ_command.py && sleep 0' 18911 1727096286.46029: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096286.46033: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096286.46048: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096286.46109: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096286.46112: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096286.46174: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096286.48250: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096286.48253: stdout chunk (state=3): >>><<< 18911 1727096286.48255: stderr chunk (state=3): >>><<< 18911 1727096286.48258: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096286.48260: _low_level_execute_command(): starting 18911 1727096286.48265: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096286.3024685-19173-211568984753367/AnsiballZ_command.py && sleep 0' 18911 1727096286.48881: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096286.48988: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096286.49106: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096286.64994: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-23 08:58:06.643247", "end": "2024-09-23 08:58:06.646681", "delta": "0:00:00.003434", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 18911 1727096286.66421: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096286.66478: stderr chunk (state=3): >>>Shared connection to 10.31.11.125 closed. <<< 18911 1727096286.66545: stdout chunk (state=3): >>><<< 18911 1727096286.66566: stderr chunk (state=3): >>><<< 18911 1727096286.66605: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-23 08:58:06.643247", "end": "2024-09-23 08:58:06.646681", "delta": "0:00:00.003434", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 18911 1727096286.66703: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096286.3024685-19173-211568984753367/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18911 1727096286.66718: _low_level_execute_command(): starting 18911 1727096286.66731: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096286.3024685-19173-211568984753367/ > /dev/null 2>&1 && sleep 0' 18911 1727096286.67376: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096286.67390: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096286.67408: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096286.67415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096286.67477: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096286.67480: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096286.67542: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096286.69464: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096286.69471: stdout chunk (state=3): >>><<< 18911 1727096286.69474: stderr chunk (state=3): >>><<< 18911 1727096286.69490: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096286.69675: handler run complete 18911 1727096286.69678: Evaluated conditional (False): False 18911 1727096286.69680: attempt loop complete, returning result 18911 1727096286.69683: _execute() done 18911 1727096286.69684: dumping result to json 18911 1727096286.69686: done dumping result, returning 18911 1727096286.69688: done running TaskExecutor() for managed_node1/TASK: Gather current interface info [0afff68d-5257-09a7-aae1-000000000115] 18911 1727096286.69690: sending task result for task 0afff68d-5257-09a7-aae1-000000000115 18911 1727096286.69763: done sending task result for task 0afff68d-5257-09a7-aae1-000000000115 18911 1727096286.69766: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003434", "end": "2024-09-23 08:58:06.646681", "rc": 0, "start": "2024-09-23 08:58:06.643247" } STDOUT: bonding_masters eth0 lo 18911 1727096286.69856: no more pending results, returning what we have 18911 1727096286.69859: results queue empty 18911 1727096286.69860: checking for 
any_errors_fatal 18911 1727096286.69862: done checking for any_errors_fatal 18911 1727096286.69862: checking for max_fail_percentage 18911 1727096286.69864: done checking for max_fail_percentage 18911 1727096286.69864: checking to see if all hosts have failed and the running result is not ok 18911 1727096286.69865: done checking to see if all hosts have failed 18911 1727096286.69866: getting the remaining hosts for this loop 18911 1727096286.69869: done getting the remaining hosts for this loop 18911 1727096286.69873: getting the next task for host managed_node1 18911 1727096286.69902: done getting next task for host managed_node1 18911 1727096286.69905: ^ task is: TASK: Set current_interfaces 18911 1727096286.69908: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096286.69912: getting variables 18911 1727096286.69913: in VariableManager get_vars() 18911 1727096286.69942: Calling all_inventory to load vars for managed_node1 18911 1727096286.69945: Calling groups_inventory to load vars for managed_node1 18911 1727096286.69948: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096286.69958: Calling all_plugins_play to load vars for managed_node1 18911 1727096286.69961: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096286.69963: Calling groups_plugins_play to load vars for managed_node1 18911 1727096286.70189: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096286.70333: done with get_vars() 18911 1727096286.70340: done getting variables 18911 1727096286.70385: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Monday 23 September 2024 08:58:06 -0400 (0:00:00.444) 0:00:05.818 ****** 18911 1727096286.70406: entering _queue_task() for managed_node1/set_fact 18911 1727096286.70603: worker is 1 (out of 1 available) 18911 1727096286.70615: exiting _queue_task() for managed_node1/set_fact 18911 1727096286.70627: done queuing things up, now waiting for results queue to drain 18911 1727096286.70629: waiting for pending results... 
18911 1727096286.70776: running TaskExecutor() for managed_node1/TASK: Set current_interfaces 18911 1727096286.70841: in run() - task 0afff68d-5257-09a7-aae1-000000000116 18911 1727096286.70852: variable 'ansible_search_path' from source: unknown 18911 1727096286.70856: variable 'ansible_search_path' from source: unknown 18911 1727096286.70887: calling self._execute() 18911 1727096286.70943: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096286.70947: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096286.70955: variable 'omit' from source: magic vars 18911 1727096286.71220: variable 'ansible_distribution_major_version' from source: facts 18911 1727096286.71230: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096286.71235: variable 'omit' from source: magic vars 18911 1727096286.71270: variable 'omit' from source: magic vars 18911 1727096286.71344: variable '_current_interfaces' from source: set_fact 18911 1727096286.71392: variable 'omit' from source: magic vars 18911 1727096286.71424: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18911 1727096286.71452: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18911 1727096286.71473: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18911 1727096286.71486: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096286.71495: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096286.71520: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18911 1727096286.71523: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096286.71526: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096286.71597: Set connection var ansible_shell_executable to /bin/sh 18911 1727096286.71601: Set connection var ansible_timeout to 10 18911 1727096286.71603: Set connection var ansible_shell_type to sh 18911 1727096286.71610: Set connection var ansible_module_compression to ZIP_DEFLATED 18911 1727096286.71615: Set connection var ansible_pipelining to False 18911 1727096286.71625: Set connection var ansible_connection to ssh 18911 1727096286.71639: variable 'ansible_shell_executable' from source: unknown 18911 1727096286.71642: variable 'ansible_connection' from source: unknown 18911 1727096286.71645: variable 'ansible_module_compression' from source: unknown 18911 1727096286.71647: variable 'ansible_shell_type' from source: unknown 18911 1727096286.71649: variable 'ansible_shell_executable' from source: unknown 18911 1727096286.71651: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096286.71654: variable 'ansible_pipelining' from source: unknown 18911 1727096286.71658: variable 'ansible_timeout' from source: unknown 18911 1727096286.71662: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096286.71766: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18911 1727096286.71777: variable 'omit' from source: magic vars 18911 1727096286.71782: starting attempt loop 18911 1727096286.71786: running the handler 18911 1727096286.71795: handler run complete 18911 1727096286.71802: attempt loop complete, returning result 18911 1727096286.71805: _execute() done 18911 1727096286.71807: dumping result to json 18911 1727096286.71810: done dumping result, returning 18911 
1727096286.71818: done running TaskExecutor() for managed_node1/TASK: Set current_interfaces [0afff68d-5257-09a7-aae1-000000000116] 18911 1727096286.71820: sending task result for task 0afff68d-5257-09a7-aae1-000000000116 18911 1727096286.71901: done sending task result for task 0afff68d-5257-09a7-aae1-000000000116 18911 1727096286.71904: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 18911 1727096286.72031: no more pending results, returning what we have 18911 1727096286.72034: results queue empty 18911 1727096286.72035: checking for any_errors_fatal 18911 1727096286.72041: done checking for any_errors_fatal 18911 1727096286.72042: checking for max_fail_percentage 18911 1727096286.72043: done checking for max_fail_percentage 18911 1727096286.72044: checking to see if all hosts have failed and the running result is not ok 18911 1727096286.72044: done checking to see if all hosts have failed 18911 1727096286.72045: getting the remaining hosts for this loop 18911 1727096286.72046: done getting the remaining hosts for this loop 18911 1727096286.72049: getting the next task for host managed_node1 18911 1727096286.72055: done getting next task for host managed_node1 18911 1727096286.72057: ^ task is: TASK: Show current_interfaces 18911 1727096286.72060: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096286.72063: getting variables 18911 1727096286.72065: in VariableManager get_vars() 18911 1727096286.72103: Calling all_inventory to load vars for managed_node1 18911 1727096286.72105: Calling groups_inventory to load vars for managed_node1 18911 1727096286.72108: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096286.72117: Calling all_plugins_play to load vars for managed_node1 18911 1727096286.72119: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096286.72122: Calling groups_plugins_play to load vars for managed_node1 18911 1727096286.72290: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096286.72486: done with get_vars() 18911 1727096286.72495: done getting variables 18911 1727096286.72547: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Monday 23 September 2024 08:58:06 -0400 (0:00:00.021) 0:00:05.840 ****** 18911 1727096286.72584: entering _queue_task() for managed_node1/debug 18911 1727096286.72938: worker is 1 (out of 1 available) 18911 1727096286.72950: exiting _queue_task() for managed_node1/debug 18911 1727096286.72961: done queuing things up, now waiting for results queue to drain 18911 1727096286.72962: waiting for pending results... 
18911 1727096286.73195: running TaskExecutor() for managed_node1/TASK: Show current_interfaces 18911 1727096286.73274: in run() - task 0afff68d-5257-09a7-aae1-000000000105 18911 1727096286.73278: variable 'ansible_search_path' from source: unknown 18911 1727096286.73281: variable 'ansible_search_path' from source: unknown 18911 1727096286.73299: calling self._execute() 18911 1727096286.73384: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096286.73400: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096286.73446: variable 'omit' from source: magic vars 18911 1727096286.73796: variable 'ansible_distribution_major_version' from source: facts 18911 1727096286.73806: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096286.73812: variable 'omit' from source: magic vars 18911 1727096286.73852: variable 'omit' from source: magic vars 18911 1727096286.73924: variable 'current_interfaces' from source: set_fact 18911 1727096286.73945: variable 'omit' from source: magic vars 18911 1727096286.73981: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18911 1727096286.74011: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18911 1727096286.74027: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18911 1727096286.74040: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096286.74051: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096286.74077: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18911 1727096286.74080: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096286.74083: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096286.74151: Set connection var ansible_shell_executable to /bin/sh 18911 1727096286.74156: Set connection var ansible_timeout to 10 18911 1727096286.74158: Set connection var ansible_shell_type to sh 18911 1727096286.74169: Set connection var ansible_module_compression to ZIP_DEFLATED 18911 1727096286.74174: Set connection var ansible_pipelining to False 18911 1727096286.74179: Set connection var ansible_connection to ssh 18911 1727096286.74196: variable 'ansible_shell_executable' from source: unknown 18911 1727096286.74199: variable 'ansible_connection' from source: unknown 18911 1727096286.74201: variable 'ansible_module_compression' from source: unknown 18911 1727096286.74205: variable 'ansible_shell_type' from source: unknown 18911 1727096286.74207: variable 'ansible_shell_executable' from source: unknown 18911 1727096286.74209: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096286.74211: variable 'ansible_pipelining' from source: unknown 18911 1727096286.74214: variable 'ansible_timeout' from source: unknown 18911 1727096286.74221: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096286.74321: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18911 1727096286.74329: variable 'omit' from source: magic vars 18911 1727096286.74334: starting attempt loop 18911 1727096286.74337: running the handler 18911 1727096286.74378: handler run complete 18911 1727096286.74388: attempt loop complete, returning result 18911 1727096286.74391: _execute() done 18911 1727096286.74394: dumping result to json 18911 1727096286.74396: done dumping result, returning 18911 1727096286.74404: done 
running TaskExecutor() for managed_node1/TASK: Show current_interfaces [0afff68d-5257-09a7-aae1-000000000105] 18911 1727096286.74407: sending task result for task 0afff68d-5257-09a7-aae1-000000000105 18911 1727096286.74488: done sending task result for task 0afff68d-5257-09a7-aae1-000000000105 18911 1727096286.74491: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 18911 1727096286.74541: no more pending results, returning what we have 18911 1727096286.74544: results queue empty 18911 1727096286.74545: checking for any_errors_fatal 18911 1727096286.74550: done checking for any_errors_fatal 18911 1727096286.74550: checking for max_fail_percentage 18911 1727096286.74552: done checking for max_fail_percentage 18911 1727096286.74553: checking to see if all hosts have failed and the running result is not ok 18911 1727096286.74553: done checking to see if all hosts have failed 18911 1727096286.74554: getting the remaining hosts for this loop 18911 1727096286.74555: done getting the remaining hosts for this loop 18911 1727096286.74558: getting the next task for host managed_node1 18911 1727096286.74566: done getting next task for host managed_node1 18911 1727096286.74570: ^ task is: TASK: Include the task 'manage_test_interface.yml' 18911 1727096286.74572: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096286.74575: getting variables 18911 1727096286.74577: in VariableManager get_vars() 18911 1727096286.74609: Calling all_inventory to load vars for managed_node1 18911 1727096286.74613: Calling groups_inventory to load vars for managed_node1 18911 1727096286.74616: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096286.74626: Calling all_plugins_play to load vars for managed_node1 18911 1727096286.74629: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096286.74631: Calling groups_plugins_play to load vars for managed_node1 18911 1727096286.74797: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096286.74911: done with get_vars() 18911 1727096286.74918: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:26 Monday 23 September 2024 08:58:06 -0400 (0:00:00.023) 0:00:05.864 ****** 18911 1727096286.74981: entering _queue_task() for managed_node1/include_tasks 18911 1727096286.75184: worker is 1 (out of 1 available) 18911 1727096286.75198: exiting _queue_task() for managed_node1/include_tasks 18911 1727096286.75210: done queuing things up, now waiting for results queue to drain 18911 1727096286.75212: waiting for pending results... 
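
The `Set current_interfaces` / `Show current_interfaces` pair traced above is the usual set_fact-then-debug pattern in the linux_system_roles test helpers. A minimal sketch of tasks that would produce the `ok` results and the `MSG: current_interfaces: [...]` line seen in this log — task names are taken from the log, but the exact contents of `show_interfaces.yml` are an assumption:

```yaml
# Hypothetical reconstruction from the log output; the real
# tasks/show_interfaces.yml in the collection may differ in detail.
- name: Set current_interfaces
  set_fact:
    current_interfaces: "{{ _current_interfaces }}"   # source variable name assumed

- name: Show current_interfaces
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"
```

With `current_interfaces` set to `['bonding_masters', 'eth0', 'lo']`, the debug task emits exactly the `MSG:` line recorded above.
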
18911 1727096286.75387: running TaskExecutor() for managed_node1/TASK: Include the task 'manage_test_interface.yml' 18911 1727096286.75500: in run() - task 0afff68d-5257-09a7-aae1-000000000011 18911 1727096286.75505: variable 'ansible_search_path' from source: unknown 18911 1727096286.75507: calling self._execute() 18911 1727096286.75673: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096286.75677: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096286.75679: variable 'omit' from source: magic vars 18911 1727096286.75951: variable 'ansible_distribution_major_version' from source: facts 18911 1727096286.75969: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096286.75980: _execute() done 18911 1727096286.75987: dumping result to json 18911 1727096286.75994: done dumping result, returning 18911 1727096286.76004: done running TaskExecutor() for managed_node1/TASK: Include the task 'manage_test_interface.yml' [0afff68d-5257-09a7-aae1-000000000011] 18911 1727096286.76012: sending task result for task 0afff68d-5257-09a7-aae1-000000000011 18911 1727096286.76137: no more pending results, returning what we have 18911 1727096286.76142: in VariableManager get_vars() 18911 1727096286.76178: Calling all_inventory to load vars for managed_node1 18911 1727096286.76181: Calling groups_inventory to load vars for managed_node1 18911 1727096286.76184: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096286.76199: Calling all_plugins_play to load vars for managed_node1 18911 1727096286.76201: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096286.76206: Calling groups_plugins_play to load vars for managed_node1 18911 1727096286.76466: done sending task result for task 0afff68d-5257-09a7-aae1-000000000011 18911 1727096286.76471: WORKER PROCESS EXITING 18911 1727096286.76493: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096286.76682: done with get_vars() 18911 1727096286.76692: variable 'ansible_search_path' from source: unknown 18911 1727096286.76706: we have included files to process 18911 1727096286.76707: generating all_blocks data 18911 1727096286.76709: done generating all_blocks data 18911 1727096286.76714: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 18911 1727096286.76715: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 18911 1727096286.76718: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 18911 1727096286.77266: in VariableManager get_vars() 18911 1727096286.77283: done with get_vars() 18911 1727096286.77516: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 18911 1727096286.78123: done processing included file 18911 1727096286.78124: iterating over new_blocks loaded from include file 18911 1727096286.78125: in VariableManager get_vars() 18911 1727096286.78134: done with get_vars() 18911 1727096286.78135: filtering new block on tags 18911 1727096286.78153: done filtering new block on tags 18911 1727096286.78155: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed_node1 18911 1727096286.78158: extending task lists for all hosts with included blocks 18911 1727096286.78308: done extending task lists 18911 1727096286.78309: done processing included files 18911 1727096286.78309: results queue empty 18911 1727096286.78310: checking for any_errors_fatal 18911 1727096286.78312: done checking for 
any_errors_fatal 18911 1727096286.78313: checking for max_fail_percentage 18911 1727096286.78314: done checking for max_fail_percentage 18911 1727096286.78315: checking to see if all hosts have failed and the running result is not ok 18911 1727096286.78315: done checking to see if all hosts have failed 18911 1727096286.78316: getting the remaining hosts for this loop 18911 1727096286.78316: done getting the remaining hosts for this loop 18911 1727096286.78318: getting the next task for host managed_node1 18911 1727096286.78320: done getting next task for host managed_node1 18911 1727096286.78322: ^ task is: TASK: Ensure state in ["present", "absent"] 18911 1727096286.78323: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096286.78325: getting variables 18911 1727096286.78325: in VariableManager get_vars() 18911 1727096286.78331: Calling all_inventory to load vars for managed_node1 18911 1727096286.78333: Calling groups_inventory to load vars for managed_node1 18911 1727096286.78334: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096286.78338: Calling all_plugins_play to load vars for managed_node1 18911 1727096286.78340: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096286.78341: Calling groups_plugins_play to load vars for managed_node1 18911 1727096286.78430: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096286.78539: done with get_vars() 18911 1727096286.78546: done getting variables 18911 1727096286.78595: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Monday 23 September 2024 08:58:06 -0400 (0:00:00.036) 0:00:05.900 ****** 18911 1727096286.78614: entering _queue_task() for managed_node1/fail 18911 1727096286.78615: Creating lock for fail 18911 1727096286.78859: worker is 1 (out of 1 available) 18911 1727096286.78877: exiting _queue_task() for managed_node1/fail 18911 1727096286.78889: done queuing things up, now waiting for results queue to drain 18911 1727096286.78891: waiting for pending results... 
18911 1727096286.79039: running TaskExecutor() for managed_node1/TASK: Ensure state in ["present", "absent"] 18911 1727096286.79100: in run() - task 0afff68d-5257-09a7-aae1-000000000131 18911 1727096286.79111: variable 'ansible_search_path' from source: unknown 18911 1727096286.79115: variable 'ansible_search_path' from source: unknown 18911 1727096286.79143: calling self._execute() 18911 1727096286.79204: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096286.79207: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096286.79217: variable 'omit' from source: magic vars 18911 1727096286.79490: variable 'ansible_distribution_major_version' from source: facts 18911 1727096286.79499: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096286.79592: variable 'state' from source: include params 18911 1727096286.79596: Evaluated conditional (state not in ["present", "absent"]): False 18911 1727096286.79598: when evaluation is False, skipping this task 18911 1727096286.79602: _execute() done 18911 1727096286.79606: dumping result to json 18911 1727096286.79608: done dumping result, returning 18911 1727096286.79615: done running TaskExecutor() for managed_node1/TASK: Ensure state in ["present", "absent"] [0afff68d-5257-09a7-aae1-000000000131] 18911 1727096286.79620: sending task result for task 0afff68d-5257-09a7-aae1-000000000131 18911 1727096286.79704: done sending task result for task 0afff68d-5257-09a7-aae1-000000000131 18911 1727096286.79707: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 18911 1727096286.79751: no more pending results, returning what we have 18911 1727096286.79755: results queue empty 18911 1727096286.79756: checking for any_errors_fatal 18911 1727096286.79758: done checking for any_errors_fatal 18911 1727096286.79758: 
checking for max_fail_percentage 18911 1727096286.79760: done checking for max_fail_percentage 18911 1727096286.79763: checking to see if all hosts have failed and the running result is not ok 18911 1727096286.79763: done checking to see if all hosts have failed 18911 1727096286.79764: getting the remaining hosts for this loop 18911 1727096286.79765: done getting the remaining hosts for this loop 18911 1727096286.79770: getting the next task for host managed_node1 18911 1727096286.79776: done getting next task for host managed_node1 18911 1727096286.79779: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 18911 1727096286.79782: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096286.79785: getting variables 18911 1727096286.79787: in VariableManager get_vars() 18911 1727096286.79815: Calling all_inventory to load vars for managed_node1 18911 1727096286.79818: Calling groups_inventory to load vars for managed_node1 18911 1727096286.79821: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096286.79833: Calling all_plugins_play to load vars for managed_node1 18911 1727096286.79835: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096286.79838: Calling groups_plugins_play to load vars for managed_node1 18911 1727096286.80090: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096286.80419: done with get_vars() 18911 1727096286.80429: done getting variables 18911 1727096286.80509: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Monday 23 September 2024 08:58:06 -0400 (0:00:00.019) 0:00:05.919 ****** 18911 1727096286.80539: entering _queue_task() for managed_node1/fail 18911 1727096286.80839: worker is 1 (out of 1 available) 18911 1727096286.80851: exiting _queue_task() for managed_node1/fail 18911 1727096286.80864: done queuing things up, now waiting for results queue to drain 18911 1727096286.80865: waiting for pending results... 
18911 1727096286.81197: running TaskExecutor() for managed_node1/TASK: Ensure type in ["dummy", "tap", "veth"] 18911 1727096286.81215: in run() - task 0afff68d-5257-09a7-aae1-000000000132 18911 1727096286.81234: variable 'ansible_search_path' from source: unknown 18911 1727096286.81274: variable 'ansible_search_path' from source: unknown 18911 1727096286.81289: calling self._execute() 18911 1727096286.81373: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096286.81385: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096286.81523: variable 'omit' from source: magic vars 18911 1727096286.81904: variable 'ansible_distribution_major_version' from source: facts 18911 1727096286.81913: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096286.82013: variable 'type' from source: set_fact 18911 1727096286.82017: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 18911 1727096286.82020: when evaluation is False, skipping this task 18911 1727096286.82022: _execute() done 18911 1727096286.82027: dumping result to json 18911 1727096286.82029: done dumping result, returning 18911 1727096286.82037: done running TaskExecutor() for managed_node1/TASK: Ensure type in ["dummy", "tap", "veth"] [0afff68d-5257-09a7-aae1-000000000132] 18911 1727096286.82041: sending task result for task 0afff68d-5257-09a7-aae1-000000000132 18911 1727096286.82135: done sending task result for task 0afff68d-5257-09a7-aae1-000000000132 18911 1727096286.82138: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 18911 1727096286.82220: no more pending results, returning what we have 18911 1727096286.82223: results queue empty 18911 1727096286.82224: checking for any_errors_fatal 18911 1727096286.82232: done checking for any_errors_fatal 18911 1727096286.82232: 
checking for max_fail_percentage 18911 1727096286.82234: done checking for max_fail_percentage 18911 1727096286.82235: checking to see if all hosts have failed and the running result is not ok 18911 1727096286.82235: done checking to see if all hosts have failed 18911 1727096286.82236: getting the remaining hosts for this loop 18911 1727096286.82238: done getting the remaining hosts for this loop 18911 1727096286.82241: getting the next task for host managed_node1 18911 1727096286.82247: done getting next task for host managed_node1 18911 1727096286.82249: ^ task is: TASK: Include the task 'show_interfaces.yml' 18911 1727096286.82252: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096286.82255: getting variables 18911 1727096286.82257: in VariableManager get_vars() 18911 1727096286.82284: Calling all_inventory to load vars for managed_node1 18911 1727096286.82286: Calling groups_inventory to load vars for managed_node1 18911 1727096286.82289: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096286.82298: Calling all_plugins_play to load vars for managed_node1 18911 1727096286.82301: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096286.82303: Calling groups_plugins_play to load vars for managed_node1 18911 1727096286.82429: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096286.82542: done with get_vars() 18911 1727096286.82550: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Monday 23 September 2024 08:58:06 -0400 (0:00:00.020) 0:00:05.940 ****** 18911 1727096286.82617: entering _queue_task() for managed_node1/include_tasks 18911 1727096286.82823: worker is 1 (out of 1 available) 18911 1727096286.82835: exiting _queue_task() for managed_node1/include_tasks 18911 1727096286.82848: done queuing things up, now waiting for results queue to drain 18911 1727096286.82849: waiting for pending results... 
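
The two guard tasks above (`Ensure state in ["present", "absent"]` and `Ensure type in ["dummy", "tap", "veth"]`) both skip because their `when` conditions evaluate to False; the log's `false_condition` fields give those conditions verbatim. A sketch of the guard pattern in `manage_test_interface.yml` — the conditions come straight from the log, while the `msg` strings are illustrative assumptions:

```yaml
# Guard tasks inferred from the false_condition values in the log;
# the fail messages below are placeholders, not the collection's actual text.
- name: Ensure state in ["present", "absent"]
  fail:
    msg: "state must be one of: present, absent"
  when: state not in ["present", "absent"]

- name: Ensure type in ["dummy", "tap", "veth"]
  fail:
    msg: "type must be one of: dummy, tap, veth"
  when: type not in ["dummy", "tap", "veth"]
```

When the condition is False, Ansible records `skipping:` with `"skip_reason": "Conditional result was False"`, which is what appears for both tasks in this trace.
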
18911 1727096286.83002: running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' 18911 1727096286.83064: in run() - task 0afff68d-5257-09a7-aae1-000000000133 18911 1727096286.83082: variable 'ansible_search_path' from source: unknown 18911 1727096286.83086: variable 'ansible_search_path' from source: unknown 18911 1727096286.83110: calling self._execute() 18911 1727096286.83163: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096286.83172: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096286.83185: variable 'omit' from source: magic vars 18911 1727096286.83497: variable 'ansible_distribution_major_version' from source: facts 18911 1727096286.83509: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096286.83514: _execute() done 18911 1727096286.83518: dumping result to json 18911 1727096286.83520: done dumping result, returning 18911 1727096286.83523: done running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' [0afff68d-5257-09a7-aae1-000000000133] 18911 1727096286.83529: sending task result for task 0afff68d-5257-09a7-aae1-000000000133 18911 1727096286.83641: done sending task result for task 0afff68d-5257-09a7-aae1-000000000133 18911 1727096286.83643: WORKER PROCESS EXITING 18911 1727096286.83724: no more pending results, returning what we have 18911 1727096286.83728: in VariableManager get_vars() 18911 1727096286.83755: Calling all_inventory to load vars for managed_node1 18911 1727096286.83758: Calling groups_inventory to load vars for managed_node1 18911 1727096286.83760: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096286.83772: Calling all_plugins_play to load vars for managed_node1 18911 1727096286.83775: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096286.83777: Calling groups_plugins_play to load vars for managed_node1 18911 1727096286.84211: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096286.84444: done with get_vars() 18911 1727096286.84451: variable 'ansible_search_path' from source: unknown 18911 1727096286.84452: variable 'ansible_search_path' from source: unknown 18911 1727096286.84492: we have included files to process 18911 1727096286.84493: generating all_blocks data 18911 1727096286.84494: done generating all_blocks data 18911 1727096286.84499: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 18911 1727096286.84500: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 18911 1727096286.84502: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 18911 1727096286.84605: in VariableManager get_vars() 18911 1727096286.84621: done with get_vars() 18911 1727096286.84728: done processing included file 18911 1727096286.84730: iterating over new_blocks loaded from include file 18911 1727096286.84732: in VariableManager get_vars() 18911 1727096286.84745: done with get_vars() 18911 1727096286.84746: filtering new block on tags 18911 1727096286.84766: done filtering new block on tags 18911 1727096286.84770: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node1 18911 1727096286.84775: extending task lists for all hosts with included blocks 18911 1727096286.85175: done extending task lists 18911 1727096286.85177: done processing included files 18911 1727096286.85178: results queue empty 18911 1727096286.85178: checking for any_errors_fatal 18911 1727096286.85180: done checking for any_errors_fatal 18911 1727096286.85181: checking for 
max_fail_percentage 18911 1727096286.85182: done checking for max_fail_percentage 18911 1727096286.85183: checking to see if all hosts have failed and the running result is not ok 18911 1727096286.85184: done checking to see if all hosts have failed 18911 1727096286.85184: getting the remaining hosts for this loop 18911 1727096286.85186: done getting the remaining hosts for this loop 18911 1727096286.85188: getting the next task for host managed_node1 18911 1727096286.85192: done getting next task for host managed_node1 18911 1727096286.85194: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 18911 1727096286.85197: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096286.85199: getting variables 18911 1727096286.85200: in VariableManager get_vars() 18911 1727096286.85208: Calling all_inventory to load vars for managed_node1 18911 1727096286.85211: Calling groups_inventory to load vars for managed_node1 18911 1727096286.85213: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096286.85218: Calling all_plugins_play to load vars for managed_node1 18911 1727096286.85220: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096286.85223: Calling groups_plugins_play to load vars for managed_node1 18911 1727096286.85395: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096286.85587: done with get_vars() 18911 1727096286.85596: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Monday 23 September 2024 08:58:06 -0400 (0:00:00.030) 0:00:05.971 ****** 18911 1727096286.85670: entering _queue_task() for managed_node1/include_tasks 18911 1727096286.85975: worker is 1 (out of 1 available) 18911 1727096286.85987: exiting _queue_task() for managed_node1/include_tasks 18911 1727096286.85998: done queuing things up, now waiting for results queue to drain 18911 1727096286.86000: waiting for pending results... 
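
The "we have included files to process / generating all_blocks data / extending task lists" sequences in this trace are the strategy plugin handling dynamic `include_tasks` results: the include returns a file path, the file is parsed into new blocks, and those blocks are spliced into the host's task list. A hedged sketch of the include chain implied by the task paths in the log (file locations match the log; the surrounding structure is assumed):

```yaml
# Assumed shape of the include chain seen in this trace:
# tests_ethernet.yml -> manage_test_interface.yml -> show_interfaces.yml
# -> get_current_interfaces.yml. Paths are relative to the playbook dir.
- name: Include the task 'show_interfaces.yml'
  include_tasks: tasks/show_interfaces.yml

- name: Include the task 'get_current_interfaces.yml'
  include_tasks: tasks/get_current_interfaces.yml
```

Because `include_tasks` is evaluated at runtime rather than at parse time, each include shows up in the log as its own task result ("no more pending results") followed by a separate "processing included file" / "extending task lists for all hosts with included blocks" phase.
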
18911 1727096286.86387: running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' 18911 1727096286.86392: in run() - task 0afff68d-5257-09a7-aae1-00000000015c 18911 1727096286.86395: variable 'ansible_search_path' from source: unknown 18911 1727096286.86397: variable 'ansible_search_path' from source: unknown 18911 1727096286.86421: calling self._execute() 18911 1727096286.86506: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096286.86516: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096286.86529: variable 'omit' from source: magic vars 18911 1727096286.86899: variable 'ansible_distribution_major_version' from source: facts 18911 1727096286.86923: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096286.86933: _execute() done 18911 1727096286.86941: dumping result to json 18911 1727096286.86948: done dumping result, returning 18911 1727096286.86957: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' [0afff68d-5257-09a7-aae1-00000000015c] 18911 1727096286.86971: sending task result for task 0afff68d-5257-09a7-aae1-00000000015c 18911 1727096286.87275: done sending task result for task 0afff68d-5257-09a7-aae1-00000000015c 18911 1727096286.87279: WORKER PROCESS EXITING 18911 1727096286.87303: no more pending results, returning what we have 18911 1727096286.87307: in VariableManager get_vars() 18911 1727096286.87339: Calling all_inventory to load vars for managed_node1 18911 1727096286.87342: Calling groups_inventory to load vars for managed_node1 18911 1727096286.87345: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096286.87357: Calling all_plugins_play to load vars for managed_node1 18911 1727096286.87360: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096286.87367: Calling groups_plugins_play to load vars for managed_node1 18911 
1727096286.87548: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096286.87829: done with get_vars() 18911 1727096286.87838: variable 'ansible_search_path' from source: unknown 18911 1727096286.87839: variable 'ansible_search_path' from source: unknown 18911 1727096286.87899: we have included files to process 18911 1727096286.87901: generating all_blocks data 18911 1727096286.87902: done generating all_blocks data 18911 1727096286.87903: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 18911 1727096286.87904: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 18911 1727096286.87906: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 18911 1727096286.88165: done processing included file 18911 1727096286.88169: iterating over new_blocks loaded from include file 18911 1727096286.88171: in VariableManager get_vars() 18911 1727096286.88185: done with get_vars() 18911 1727096286.88187: filtering new block on tags 18911 1727096286.88205: done filtering new block on tags 18911 1727096286.88208: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node1 18911 1727096286.88212: extending task lists for all hosts with included blocks 18911 1727096286.88358: done extending task lists 18911 1727096286.88360: done processing included files 18911 1727096286.88363: results queue empty 18911 1727096286.88364: checking for any_errors_fatal 18911 1727096286.88369: done checking for any_errors_fatal 18911 1727096286.88370: checking for max_fail_percentage 18911 1727096286.88371: done 
checking for max_fail_percentage 18911 1727096286.88371: checking to see if all hosts have failed and the running result is not ok 18911 1727096286.88372: done checking to see if all hosts have failed 18911 1727096286.88373: getting the remaining hosts for this loop 18911 1727096286.88374: done getting the remaining hosts for this loop 18911 1727096286.88376: getting the next task for host managed_node1 18911 1727096286.88380: done getting next task for host managed_node1 18911 1727096286.88382: ^ task is: TASK: Gather current interface info 18911 1727096286.88385: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096286.88387: getting variables 18911 1727096286.88388: in VariableManager get_vars() 18911 1727096286.88397: Calling all_inventory to load vars for managed_node1 18911 1727096286.88399: Calling groups_inventory to load vars for managed_node1 18911 1727096286.88402: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096286.88407: Calling all_plugins_play to load vars for managed_node1 18911 1727096286.88409: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096286.88412: Calling groups_plugins_play to load vars for managed_node1 18911 1727096286.88585: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096286.88775: done with get_vars() 18911 1727096286.88784: done getting variables 18911 1727096286.88821: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Monday 23 September 2024 08:58:06 -0400 (0:00:00.031) 0:00:06.002 ****** 18911 1727096286.88850: entering _queue_task() for managed_node1/command 18911 1727096286.89148: worker is 1 (out of 1 available) 18911 1727096286.89163: exiting _queue_task() for managed_node1/command 18911 1727096286.89377: done queuing things up, now waiting for results queue to drain 18911 1727096286.89379: waiting for pending results... 
18911 1727096286.89424: running TaskExecutor() for managed_node1/TASK: Gather current interface info 18911 1727096286.89545: in run() - task 0afff68d-5257-09a7-aae1-000000000193 18911 1727096286.89572: variable 'ansible_search_path' from source: unknown 18911 1727096286.89580: variable 'ansible_search_path' from source: unknown 18911 1727096286.89622: calling self._execute() 18911 1727096286.89703: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096286.89720: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096286.89735: variable 'omit' from source: magic vars 18911 1727096286.90109: variable 'ansible_distribution_major_version' from source: facts 18911 1727096286.90149: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096286.90152: variable 'omit' from source: magic vars 18911 1727096286.90199: variable 'omit' from source: magic vars 18911 1727096286.90238: variable 'omit' from source: magic vars 18911 1727096286.90370: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18911 1727096286.90373: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18911 1727096286.90376: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18911 1727096286.90382: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096286.90398: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096286.90431: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18911 1727096286.90440: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096286.90448: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 
1727096286.90553: Set connection var ansible_shell_executable to /bin/sh 18911 1727096286.90569: Set connection var ansible_timeout to 10 18911 1727096286.90579: Set connection var ansible_shell_type to sh 18911 1727096286.90593: Set connection var ansible_module_compression to ZIP_DEFLATED 18911 1727096286.90603: Set connection var ansible_pipelining to False 18911 1727096286.90612: Set connection var ansible_connection to ssh 18911 1727096286.90638: variable 'ansible_shell_executable' from source: unknown 18911 1727096286.90646: variable 'ansible_connection' from source: unknown 18911 1727096286.90653: variable 'ansible_module_compression' from source: unknown 18911 1727096286.90695: variable 'ansible_shell_type' from source: unknown 18911 1727096286.90698: variable 'ansible_shell_executable' from source: unknown 18911 1727096286.90700: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096286.90702: variable 'ansible_pipelining' from source: unknown 18911 1727096286.90704: variable 'ansible_timeout' from source: unknown 18911 1727096286.90706: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096286.90840: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18911 1727096286.90855: variable 'omit' from source: magic vars 18911 1727096286.90870: starting attempt loop 18911 1727096286.90911: running the handler 18911 1727096286.90914: _low_level_execute_command(): starting 18911 1727096286.90916: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18911 1727096286.91655: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096286.91682: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096286.91793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096286.91821: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096286.91915: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096286.93656: stdout chunk (state=3): >>>/root <<< 18911 1727096286.93797: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096286.93812: stdout chunk (state=3): >>><<< 18911 1727096286.93829: stderr chunk (state=3): >>><<< 18911 1727096286.93849: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096286.93949: _low_level_execute_command(): starting 18911 1727096286.93953: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096286.938552-19216-190346714099820 `" && echo ansible-tmp-1727096286.938552-19216-190346714099820="` echo /root/.ansible/tmp/ansible-tmp-1727096286.938552-19216-190346714099820 `" ) && sleep 0' 18911 1727096286.94515: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096286.94530: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096286.94544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096286.94560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096286.94581: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 18911 1727096286.94621: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096286.94700: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096286.94751: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096286.94754: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096286.94850: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096286.96852: stdout chunk (state=3): >>>ansible-tmp-1727096286.938552-19216-190346714099820=/root/.ansible/tmp/ansible-tmp-1727096286.938552-19216-190346714099820 <<< 18911 1727096286.97013: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096286.97016: stdout chunk (state=3): >>><<< 18911 1727096286.97019: stderr chunk (state=3): >>><<< 18911 1727096286.97042: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096286.938552-19216-190346714099820=/root/.ansible/tmp/ansible-tmp-1727096286.938552-19216-190346714099820 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096286.97172: variable 'ansible_module_compression' from source: unknown 18911 1727096286.97175: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18911d7od04qi/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 18911 1727096286.97178: variable 'ansible_facts' from source: unknown 18911 1727096286.97256: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096286.938552-19216-190346714099820/AnsiballZ_command.py 18911 1727096286.97421: Sending initial data 18911 1727096286.97430: Sent initial data (155 bytes) 18911 1727096286.98091: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096286.98184: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096286.98216: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096286.98234: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096286.98258: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096286.98357: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096286.99983: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18911 1727096287.00038: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18911 1727096287.00112: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18911d7od04qi/tmpg5_6bole /root/.ansible/tmp/ansible-tmp-1727096286.938552-19216-190346714099820/AnsiballZ_command.py <<< 18911 1727096287.00115: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096286.938552-19216-190346714099820/AnsiballZ_command.py" <<< 18911 1727096287.00176: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18911d7od04qi/tmpg5_6bole" to remote "/root/.ansible/tmp/ansible-tmp-1727096286.938552-19216-190346714099820/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096286.938552-19216-190346714099820/AnsiballZ_command.py" <<< 18911 1727096287.01207: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096287.01210: stdout chunk (state=3): >>><<< 18911 1727096287.01213: stderr chunk (state=3): >>><<< 18911 1727096287.01215: done transferring module to remote 18911 1727096287.01217: _low_level_execute_command(): starting 18911 1727096287.01219: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096286.938552-19216-190346714099820/ /root/.ansible/tmp/ansible-tmp-1727096286.938552-19216-190346714099820/AnsiballZ_command.py && sleep 0' 18911 1727096287.01780: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096287.01796: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096287.01887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096287.01917: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096287.01936: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096287.01957: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096287.02055: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096287.03986: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096287.04020: stdout chunk (state=3): >>><<< 18911 1727096287.04023: stderr chunk (state=3): >>><<< 18911 1727096287.04073: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096287.04076: _low_level_execute_command(): starting 18911 1727096287.04079: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096286.938552-19216-190346714099820/AnsiballZ_command.py && sleep 0' 18911 1727096287.04712: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096287.04800: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096287.04803: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096287.04888: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 18911 1727096287.20566: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-23 08:58:07.200943", "end": "2024-09-23 08:58:07.204304", "delta": "0:00:00.003361", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 18911 1727096287.22118: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. <<< 18911 1727096287.22144: stderr chunk (state=3): >>><<< 18911 1727096287.22147: stdout chunk (state=3): >>><<< 18911 1727096287.22174: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-23 08:58:07.200943", "end": "2024-09-23 08:58:07.204304", "delta": "0:00:00.003361", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 18911 1727096287.22198: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096286.938552-19216-190346714099820/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18911 1727096287.22205: _low_level_execute_command(): starting 18911 1727096287.22210: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096286.938552-19216-190346714099820/ > /dev/null 2>&1 && sleep 0' 18911 1727096287.22671: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096287.22675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096287.22678: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096287.22680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096287.22733: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096287.22737: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096287.22743: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096287.22812: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096287.24711: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096287.24741: stderr chunk (state=3): >>><<< 18911 1727096287.24744: stdout chunk (state=3): >>><<< 18911 1727096287.24758: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.11.125 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23'
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
18911 1727096287.24768: handler run complete
18911 1727096287.24788: Evaluated conditional (False): False
18911 1727096287.24796: attempt loop complete, returning result
18911 1727096287.24799: _execute() done
18911 1727096287.24801: dumping result to json
18911 1727096287.24807: done dumping result, returning
18911 1727096287.24814: done running TaskExecutor() for managed_node1/TASK: Gather current interface info [0afff68d-5257-09a7-aae1-000000000193]
18911 1727096287.24819: sending task result for task 0afff68d-5257-09a7-aae1-000000000193
18911 1727096287.24918: done sending task result for task 0afff68d-5257-09a7-aae1-000000000193
18911 1727096287.24921: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "changed": false,
    "cmd": [
        "ls",
        "-1"
    ],
    "delta": "0:00:00.003361",
    "end": "2024-09-23 08:58:07.204304",
    "rc": 0,
    "start": "2024-09-23 08:58:07.200943"
}

STDOUT:

bonding_masters
eth0
lo

18911 1727096287.25008: no more pending results, returning what we have
18911 1727096287.25012: results queue empty
18911 1727096287.25013: checking for any_errors_fatal
18911 1727096287.25014: done checking for any_errors_fatal
18911 1727096287.25015: checking for max_fail_percentage
18911 1727096287.25016: done checking for max_fail_percentage
18911 1727096287.25017: checking to see if all hosts have failed and the running result is not ok
18911 1727096287.25018: done checking to see if all hosts have failed
18911 1727096287.25018: getting the remaining hosts for this loop
18911 1727096287.25019: done getting the remaining hosts for this loop
18911 1727096287.25023: getting the next task for host managed_node1
18911 1727096287.25030: done getting next task for host managed_node1
18911 1727096287.25032: ^ task is: TASK: Set current_interfaces
18911 1727096287.25037: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18911 1727096287.25042: getting variables
18911 1727096287.25043: in VariableManager get_vars()
18911 1727096287.25074: Calling all_inventory to load vars for managed_node1
18911 1727096287.25077: Calling groups_inventory to load vars for managed_node1
18911 1727096287.25081: Calling all_plugins_inventory to load vars for managed_node1
18911 1727096287.25091: Calling all_plugins_play to load vars for managed_node1
18911 1727096287.25093: Calling groups_plugins_inventory to load vars for managed_node1
18911 1727096287.25095: Calling groups_plugins_play to load vars for managed_node1
18911 1727096287.25250: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18911 1727096287.25389: done with get_vars()
18911 1727096287.25397: done getting variables
18911 1727096287.25439: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Set current_interfaces] **************************************************
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9
Monday 23 September 2024 08:58:07 -0400 (0:00:00.366) 0:00:06.369 ******
18911 1727096287.25462: entering _queue_task() for managed_node1/set_fact
18911 1727096287.25678: worker is 1 (out of 1 available)
18911 1727096287.25692: exiting _queue_task() for managed_node1/set_fact
18911 1727096287.25704: done queuing things up, now waiting for results queue to drain
18911 1727096287.25705: waiting for pending results...
18911 1727096287.25856: running TaskExecutor() for managed_node1/TASK: Set current_interfaces
18911 1727096287.25936: in run() - task 0afff68d-5257-09a7-aae1-000000000194
18911 1727096287.25945: variable 'ansible_search_path' from source: unknown
18911 1727096287.25948: variable 'ansible_search_path' from source: unknown
18911 1727096287.25981: calling self._execute()
18911 1727096287.26039: variable 'ansible_host' from source: host vars for 'managed_node1'
18911 1727096287.26043: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18911 1727096287.26051: variable 'omit' from source: magic vars
18911 1727096287.26324: variable 'ansible_distribution_major_version' from source: facts
18911 1727096287.26334: Evaluated conditional (ansible_distribution_major_version != '6'): True
18911 1727096287.26339: variable 'omit' from source: magic vars
18911 1727096287.26376: variable 'omit' from source: magic vars
18911 1727096287.26449: variable '_current_interfaces' from source: set_fact
18911 1727096287.26502: variable 'omit' from source: magic vars
18911 1727096287.26536: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
18911 1727096287.26562: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
18911 1727096287.26585: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
18911 1727096287.26599: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
18911 1727096287.26609: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
18911 1727096287.26632: variable 'inventory_hostname' from source: host vars for 'managed_node1'
18911 1727096287.26635: variable 'ansible_host' from source: host vars for 'managed_node1'
18911 1727096287.26638: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18911 1727096287.26712: Set connection var ansible_shell_executable to /bin/sh
18911 1727096287.26715: Set connection var ansible_timeout to 10
18911 1727096287.26718: Set connection var ansible_shell_type to sh
18911 1727096287.26725: Set connection var ansible_module_compression to ZIP_DEFLATED
18911 1727096287.26731: Set connection var ansible_pipelining to False
18911 1727096287.26734: Set connection var ansible_connection to ssh
18911 1727096287.26751: variable 'ansible_shell_executable' from source: unknown
18911 1727096287.26754: variable 'ansible_connection' from source: unknown
18911 1727096287.26757: variable 'ansible_module_compression' from source: unknown
18911 1727096287.26759: variable 'ansible_shell_type' from source: unknown
18911 1727096287.26761: variable 'ansible_shell_executable' from source: unknown
18911 1727096287.26766: variable 'ansible_host' from source: host vars for 'managed_node1'
18911 1727096287.26772: variable 'ansible_pipelining' from source: unknown
18911 1727096287.26774: variable 'ansible_timeout' from source: unknown
18911 1727096287.26778: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18911 1727096287.26880: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
18911 1727096287.26888: variable 'omit' from source: magic vars
18911 1727096287.26893: starting attempt loop
18911 1727096287.26896: running the handler
18911 1727096287.26905: handler run complete
18911 1727096287.26914: attempt loop complete, returning result
18911 1727096287.26917: _execute() done
18911 1727096287.26919: dumping result to json
18911 1727096287.26921: done dumping result, returning
18911 1727096287.26930: done running TaskExecutor() for managed_node1/TASK: Set current_interfaces [0afff68d-5257-09a7-aae1-000000000194]
18911 1727096287.26932: sending task result for task 0afff68d-5257-09a7-aae1-000000000194
18911 1727096287.27010: done sending task result for task 0afff68d-5257-09a7-aae1-000000000194
18911 1727096287.27013: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "ansible_facts": {
        "current_interfaces": [
            "bonding_masters",
            "eth0",
            "lo"
        ]
    },
    "changed": false
}
18911 1727096287.27089: no more pending results, returning what we have
18911 1727096287.27092: results queue empty
18911 1727096287.27093: checking for any_errors_fatal
18911 1727096287.27103: done checking for any_errors_fatal
18911 1727096287.27104: checking for max_fail_percentage
18911 1727096287.27105: done checking for max_fail_percentage
18911 1727096287.27106: checking to see if all hosts have failed and the running result is not ok
18911 1727096287.27106: done checking to see if all hosts have failed
18911 1727096287.27107: getting the remaining hosts for this loop
18911 1727096287.27108: done getting the remaining hosts for this loop
18911 1727096287.27112: getting the next task for host managed_node1
18911 1727096287.27120: done getting next task for host managed_node1
18911 1727096287.27122: ^ task is: TASK: Show current_interfaces
18911 1727096287.27126: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18911 1727096287.27129: getting variables
18911 1727096287.27130: in VariableManager get_vars()
18911 1727096287.27159: Calling all_inventory to load vars for managed_node1
18911 1727096287.27161: Calling groups_inventory to load vars for managed_node1
18911 1727096287.27164: Calling all_plugins_inventory to load vars for managed_node1
18911 1727096287.27175: Calling all_plugins_play to load vars for managed_node1
18911 1727096287.27182: Calling groups_plugins_inventory to load vars for managed_node1
18911 1727096287.27185: Calling groups_plugins_play to load vars for managed_node1
18911 1727096287.27326: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18911 1727096287.27441: done with get_vars()
18911 1727096287.27448: done getting variables
18911 1727096287.27491: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Show current_interfaces] *************************************************
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5
Monday 23 September 2024 08:58:07 -0400 (0:00:00.020) 0:00:06.389 ******
18911 1727096287.27513: entering _queue_task() for managed_node1/debug
18911 1727096287.27719: worker is 1 (out of 1 available)
18911 1727096287.27732: exiting _queue_task() for managed_node1/debug
18911 1727096287.27744: done queuing things up, now waiting for results queue to drain
18911 1727096287.27745: waiting for pending results...
18911 1727096287.27886: running TaskExecutor() for managed_node1/TASK: Show current_interfaces
18911 1727096287.27952: in run() - task 0afff68d-5257-09a7-aae1-00000000015d
18911 1727096287.27963: variable 'ansible_search_path' from source: unknown
18911 1727096287.27975: variable 'ansible_search_path' from source: unknown
18911 1727096287.28000: calling self._execute()
18911 1727096287.28056: variable 'ansible_host' from source: host vars for 'managed_node1'
18911 1727096287.28060: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18911 1727096287.28073: variable 'omit' from source: magic vars
18911 1727096287.28332: variable 'ansible_distribution_major_version' from source: facts
18911 1727096287.28342: Evaluated conditional (ansible_distribution_major_version != '6'): True
18911 1727096287.28347: variable 'omit' from source: magic vars
18911 1727096287.28381: variable 'omit' from source: magic vars
18911 1727096287.28450: variable 'current_interfaces' from source: set_fact
18911 1727096287.28474: variable 'omit' from source: magic vars
18911 1727096287.28505: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
18911 1727096287.28534: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
18911 1727096287.28550: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
18911 1727096287.28562: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
18911 1727096287.28575: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
18911 1727096287.28598: variable 'inventory_hostname' from source: host vars for 'managed_node1'
18911 1727096287.28601: variable 'ansible_host' from source: host vars for 'managed_node1'
18911 1727096287.28605: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18911 1727096287.28678: Set connection var ansible_shell_executable to /bin/sh
18911 1727096287.28681: Set connection var ansible_timeout to 10
18911 1727096287.28683: Set connection var ansible_shell_type to sh
18911 1727096287.28690: Set connection var ansible_module_compression to ZIP_DEFLATED
18911 1727096287.28695: Set connection var ansible_pipelining to False
18911 1727096287.28700: Set connection var ansible_connection to ssh
18911 1727096287.28717: variable 'ansible_shell_executable' from source: unknown
18911 1727096287.28720: variable 'ansible_connection' from source: unknown
18911 1727096287.28722: variable 'ansible_module_compression' from source: unknown
18911 1727096287.28724: variable 'ansible_shell_type' from source: unknown
18911 1727096287.28726: variable 'ansible_shell_executable' from source: unknown
18911 1727096287.28730: variable 'ansible_host' from source: host vars for 'managed_node1'
18911 1727096287.28733: variable 'ansible_pipelining' from source: unknown
18911 1727096287.28735: variable 'ansible_timeout' from source: unknown
18911 1727096287.28742: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18911 1727096287.28842: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
18911 1727096287.28853: variable 'omit' from source: magic vars
18911 1727096287.28856: starting attempt loop
18911 1727096287.28859: running the handler
18911 1727096287.28898: handler run complete
18911 1727096287.28909: attempt loop complete, returning result
18911 1727096287.28912: _execute() done
18911 1727096287.28915: dumping result to json
18911 1727096287.28919: done dumping result, returning
18911 1727096287.28925: done running TaskExecutor() for managed_node1/TASK: Show current_interfaces [0afff68d-5257-09a7-aae1-00000000015d]
18911 1727096287.28930: sending task result for task 0afff68d-5257-09a7-aae1-00000000015d
18911 1727096287.29011: done sending task result for task 0afff68d-5257-09a7-aae1-00000000015d
18911 1727096287.29013: WORKER PROCESS EXITING
ok: [managed_node1] => {}

MSG:

current_interfaces: ['bonding_masters', 'eth0', 'lo']

18911 1727096287.29058: no more pending results, returning what we have
18911 1727096287.29061: results queue empty
18911 1727096287.29062: checking for any_errors_fatal
18911 1727096287.29066: done checking for any_errors_fatal
18911 1727096287.29069: checking for max_fail_percentage
18911 1727096287.29070: done checking for max_fail_percentage
18911 1727096287.29071: checking to see if all hosts have failed and the running result is not ok
18911 1727096287.29071: done checking to see if all hosts have failed
18911 1727096287.29072: getting the remaining hosts for this loop
18911 1727096287.29073: done getting the remaining hosts for this loop
18911 1727096287.29077: getting the next task for host managed_node1
18911 1727096287.29085: done getting next task for host managed_node1
18911 1727096287.29087: ^ task is: TASK: Install iproute
18911 1727096287.29090: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18911 1727096287.29093: getting variables
18911 1727096287.29095: in VariableManager get_vars()
18911 1727096287.29121: Calling all_inventory to load vars for managed_node1
18911 1727096287.29124: Calling groups_inventory to load vars for managed_node1
18911 1727096287.29127: Calling all_plugins_inventory to load vars for managed_node1
18911 1727096287.29136: Calling all_plugins_play to load vars for managed_node1
18911 1727096287.29139: Calling groups_plugins_inventory to load vars for managed_node1
18911 1727096287.29141: Calling groups_plugins_play to load vars for managed_node1
18911 1727096287.29313: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18911 1727096287.29424: done with get_vars()
18911 1727096287.29431: done getting variables
18911 1727096287.29473: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Install iproute] *********************************************************
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
Monday 23 September 2024 08:58:07 -0400 (0:00:00.019) 0:00:06.409 ******
18911 1727096287.29493: entering _queue_task() for managed_node1/package
18911 1727096287.29691: worker is 1 (out of 1 available)
18911 1727096287.29705: exiting _queue_task() for managed_node1/package
18911 1727096287.29717: done queuing things up, now waiting for results queue to drain
18911 1727096287.29719: waiting for pending results...
18911 1727096287.29862: running TaskExecutor() for managed_node1/TASK: Install iproute
18911 1727096287.29926: in run() - task 0afff68d-5257-09a7-aae1-000000000134
18911 1727096287.29937: variable 'ansible_search_path' from source: unknown
18911 1727096287.29943: variable 'ansible_search_path' from source: unknown
18911 1727096287.29973: calling self._execute()
18911 1727096287.30025: variable 'ansible_host' from source: host vars for 'managed_node1'
18911 1727096287.30028: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18911 1727096287.30037: variable 'omit' from source: magic vars
18911 1727096287.30302: variable 'ansible_distribution_major_version' from source: facts
18911 1727096287.30312: Evaluated conditional (ansible_distribution_major_version != '6'): True
18911 1727096287.30317: variable 'omit' from source: magic vars
18911 1727096287.30342: variable 'omit' from source: magic vars
18911 1727096287.30473: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
18911 1727096287.32114: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
18911 1727096287.32174: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
18911 1727096287.32202: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
18911 1727096287.32228: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
18911 1727096287.32249: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
18911 1727096287.32322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18911 1727096287.32347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18911 1727096287.32362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18911 1727096287.32392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18911 1727096287.32403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18911 1727096287.32479: variable '__network_is_ostree' from source: set_fact
18911 1727096287.32483: variable 'omit' from source: magic vars
18911 1727096287.32507: variable 'omit' from source: magic vars
18911 1727096287.32529: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
18911 1727096287.32549: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
18911 1727096287.32568: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
18911 1727096287.32589: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
18911 1727096287.32672: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
18911 1727096287.32675: variable 'inventory_hostname' from source: host vars for 'managed_node1'
18911 1727096287.32678: variable 'ansible_host' from source: host vars for 'managed_node1'
18911 1727096287.32680: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18911 1727096287.32752: Set connection var ansible_shell_executable to /bin/sh
18911 1727096287.32765: Set connection var ansible_timeout to 10
18911 1727096287.32776: Set connection var ansible_shell_type to sh
18911 1727096287.32788: Set connection var ansible_module_compression to ZIP_DEFLATED
18911 1727096287.32798: Set connection var ansible_pipelining to False
18911 1727096287.32807: Set connection var ansible_connection to ssh
18911 1727096287.32835: variable 'ansible_shell_executable' from source: unknown
18911 1727096287.32844: variable 'ansible_connection' from source: unknown
18911 1727096287.32851: variable 'ansible_module_compression' from source: unknown
18911 1727096287.32859: variable 'ansible_shell_type' from source: unknown
18911 1727096287.32872: variable 'ansible_shell_executable' from source: unknown
18911 1727096287.32880: variable 'ansible_host' from source: host vars for 'managed_node1'
18911 1727096287.32887: variable 'ansible_pipelining' from source: unknown
18911 1727096287.32893: variable 'ansible_timeout' from source: unknown
18911 1727096287.33076: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18911 1727096287.33080: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
18911 1727096287.33082: variable 'omit' from source: magic vars
18911 1727096287.33084: starting attempt loop
18911 1727096287.33086: running the handler
18911 1727096287.33088: variable 'ansible_facts' from source: unknown
18911 1727096287.33090: variable 'ansible_facts' from source: unknown
18911 1727096287.33091: _low_level_execute_command(): starting
18911 1727096287.33099: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
18911 1727096287.33695: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
18911 1727096287.33713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found <<<
18911 1727096287.33725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
18911 1727096287.33766: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<<
18911 1727096287.33795: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
18911 1727096287.33875: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
18911 1727096287.35593: stdout chunk (state=3): >>>/root <<<
18911 1727096287.35686: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
18911 1727096287.35715: stderr chunk (state=3): >>><<<
18911 1727096287.35719: stdout chunk (state=3): >>><<<
18911 1727096287.35739: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.11.125 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23'
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
18911 1727096287.35750: _low_level_execute_command(): starting
18911 1727096287.35758: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096287.3573942-19248-231913055323058 `" && echo ansible-tmp-1727096287.3573942-19248-231913055323058="` echo /root/.ansible/tmp/ansible-tmp-1727096287.3573942-19248-231913055323058 `" ) && sleep 0'
18911 1727096287.36201: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
18911 1727096287.36205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
18911 1727096287.36218: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
18911 1727096287.36278: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<<
18911 1727096287.36281: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
18911 1727096287.36353: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
18911 1727096287.38316: stdout chunk (state=3): >>>ansible-tmp-1727096287.3573942-19248-231913055323058=/root/.ansible/tmp/ansible-tmp-1727096287.3573942-19248-231913055323058 <<<
18911 1727096287.38443: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
18911 1727096287.38446: stdout chunk (state=3): >>><<<
18911 1727096287.38453: stderr chunk (state=3): >>><<<
18911 1727096287.38469: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096287.3573942-19248-231913055323058=/root/.ansible/tmp/ansible-tmp-1727096287.3573942-19248-231913055323058
, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.11.125 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23'
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
18911 1727096287.38494: variable 'ansible_module_compression' from source: unknown
18911 1727096287.38541: ANSIBALLZ: Using generic lock for ansible.legacy.dnf
18911 1727096287.38544: ANSIBALLZ: Acquiring lock
18911 1727096287.38547: ANSIBALLZ: Lock acquired: 140481135532592
18911 1727096287.38549: ANSIBALLZ: Creating module
18911 1727096287.49494: ANSIBALLZ: Writing module into payload
18911 1727096287.49643: ANSIBALLZ: Writing module
18911 1727096287.49670: ANSIBALLZ: Renaming module
18911 1727096287.49681: ANSIBALLZ: Done creating module
18911 1727096287.49697: variable 'ansible_facts' from source: unknown
18911 1727096287.49756: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096287.3573942-19248-231913055323058/AnsiballZ_dnf.py
18911 1727096287.49859: Sending initial data
18911 1727096287.49866: Sent initial data (152 bytes)
18911 1727096287.50315: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
18911 1727096287.50319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
18911 1727096287.50322: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration <<<
18911 1727096287.50324: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
18911 1727096287.50326: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
18911 1727096287.50374: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<<
18911 1727096287.50377: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
18911 1727096287.50393: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
18911 1727096287.50462: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
18911 1727096287.52140: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<<
18911 1727096287.52205: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<<
18911 1727096287.52594: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18911d7od04qi/tmpx5_mcso9 /root/.ansible/tmp/ansible-tmp-1727096287.3573942-19248-231913055323058/AnsiballZ_dnf.py <<<
18911 1727096287.52597: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096287.3573942-19248-231913055323058/AnsiballZ_dnf.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18911d7od04qi/tmpx5_mcso9" to remote "/root/.ansible/tmp/ansible-tmp-1727096287.3573942-19248-231913055323058/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096287.3573942-19248-231913055323058/AnsiballZ_dnf.py" <<<
18911 1727096287.53668: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
18911 1727096287.53713: stderr chunk (state=3): >>><<<
18911 1727096287.53722: stdout chunk (state=3): >>><<<
18911 1727096287.53746: done transferring module to remote
18911 1727096287.53768: _low_level_execute_command(): starting
18911 1727096287.53780: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096287.3573942-19248-231913055323058/ /root/.ansible/tmp/ansible-tmp-1727096287.3573942-19248-231913055323058/AnsiballZ_dnf.py && sleep 0'
18911 1727096287.54474: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096287.54499: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096287.54591: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096287.56496: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096287.56519: stdout chunk (state=3): >>><<< 18911 1727096287.56522: stderr chunk (state=3): >>><<< 18911 1727096287.56536: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096287.56619: _low_level_execute_command(): starting 18911 1727096287.56622: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096287.3573942-19248-231913055323058/AnsiballZ_dnf.py && sleep 0' 18911 1727096287.57172: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096287.57189: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096287.57210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096287.57231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096287.57283: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096287.57342: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096287.57369: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096287.57383: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096287.57486: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096287.99299: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 18911 1727096288.03811: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
<<< 18911 1727096288.03816: stdout chunk (state=3): >>><<< 18911 1727096288.03818: stderr chunk (state=3): >>><<< 18911 1727096288.03821: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 18911 1727096288.03828: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096287.3573942-19248-231913055323058/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18911 1727096288.03832: _low_level_execute_command(): starting 18911 1727096288.03834: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096287.3573942-19248-231913055323058/ > /dev/null 2>&1 && sleep 0' 18911 1727096288.05016: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096288.05020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096288.05023: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 
18911 1727096288.05025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 18911 1727096288.05027: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096288.05197: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096288.05210: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096288.05297: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096288.07201: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096288.07232: stderr chunk (state=3): >>><<< 18911 1727096288.07241: stdout chunk (state=3): >>><<< 18911 1727096288.07263: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096288.07278: handler run complete 18911 1727096288.07630: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18911 1727096288.07973: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18911 1727096288.08016: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18911 1727096288.08472: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18911 1727096288.08476: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18911 1727096288.08478: variable '__install_status' from source: unknown 18911 1727096288.08480: Evaluated conditional (__install_status is success): True 18911 1727096288.08482: attempt loop complete, returning result 18911 1727096288.08484: _execute() done 18911 1727096288.08486: dumping result to json 18911 1727096288.08488: done dumping result, returning 18911 1727096288.08490: done running TaskExecutor() for managed_node1/TASK: Install iproute [0afff68d-5257-09a7-aae1-000000000134] 18911 1727096288.08492: sending task result for task 0afff68d-5257-09a7-aae1-000000000134 ok: [managed_node1] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 18911 1727096288.08791: no more pending results, returning what we have 18911 1727096288.08794: results queue empty 18911 1727096288.08795: checking for any_errors_fatal 18911 1727096288.08800: done checking for any_errors_fatal 18911 1727096288.08801: checking for max_fail_percentage 18911 1727096288.08803: done checking for 
max_fail_percentage 18911 1727096288.08804: checking to see if all hosts have failed and the running result is not ok 18911 1727096288.08804: done checking to see if all hosts have failed 18911 1727096288.08805: getting the remaining hosts for this loop 18911 1727096288.08806: done getting the remaining hosts for this loop 18911 1727096288.08810: getting the next task for host managed_node1 18911 1727096288.08816: done getting next task for host managed_node1 18911 1727096288.08818: ^ task is: TASK: Create veth interface {{ interface }} 18911 1727096288.08821: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096288.08825: getting variables 18911 1727096288.08826: in VariableManager get_vars() 18911 1727096288.08856: Calling all_inventory to load vars for managed_node1 18911 1727096288.08859: Calling groups_inventory to load vars for managed_node1 18911 1727096288.08866: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096288.08877: done sending task result for task 0afff68d-5257-09a7-aae1-000000000134 18911 1727096288.08880: WORKER PROCESS EXITING 18911 1727096288.08890: Calling all_plugins_play to load vars for managed_node1 18911 1727096288.08893: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096288.08896: Calling groups_plugins_play to load vars for managed_node1 18911 1727096288.09289: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096288.09824: done with get_vars() 18911 1727096288.09838: done getting variables 18911 1727096288.09900: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 18911 1727096288.10263: variable 'interface' from source: set_fact TASK [Create veth interface lsr27] ********************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Monday 23 September 2024 08:58:08 -0400 (0:00:00.807) 0:00:07.217 ****** 18911 1727096288.10295: entering _queue_task() for managed_node1/command 18911 1727096288.10933: worker is 1 (out of 1 available) 18911 1727096288.10946: exiting _queue_task() for managed_node1/command 18911 1727096288.10957: done queuing things up, now waiting for results queue to drain 18911 1727096288.10959: waiting for pending results... 
18911 1727096288.11422: running TaskExecutor() for managed_node1/TASK: Create veth interface lsr27 18911 1727096288.11534: in run() - task 0afff68d-5257-09a7-aae1-000000000135 18911 1727096288.11973: variable 'ansible_search_path' from source: unknown 18911 1727096288.11977: variable 'ansible_search_path' from source: unknown 18911 1727096288.12255: variable 'interface' from source: set_fact 18911 1727096288.12343: variable 'interface' from source: set_fact 18911 1727096288.12416: variable 'interface' from source: set_fact 18911 1727096288.12811: Loaded config def from plugin (lookup/items) 18911 1727096288.12823: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 18911 1727096288.12849: variable 'omit' from source: magic vars 18911 1727096288.12969: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096288.13372: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096288.13375: variable 'omit' from source: magic vars 18911 1727096288.14155: variable 'ansible_distribution_major_version' from source: facts 18911 1727096288.14384: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096288.14786: variable 'type' from source: set_fact 18911 1727096288.14796: variable 'state' from source: include params 18911 1727096288.14805: variable 'interface' from source: set_fact 18911 1727096288.14813: variable 'current_interfaces' from source: set_fact 18911 1727096288.14824: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 18911 1727096288.14835: variable 'omit' from source: magic vars 18911 1727096288.14880: variable 'omit' from source: magic vars 18911 1727096288.14930: variable 'item' from source: unknown 18911 1727096288.15372: variable 'item' from source: unknown 18911 1727096288.15375: variable 'omit' from source: magic vars 18911 1727096288.15380: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18911 1727096288.15383: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18911 1727096288.15386: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18911 1727096288.15388: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096288.15391: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096288.15393: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18911 1727096288.15395: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096288.15397: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096288.15664: Set connection var ansible_shell_executable to /bin/sh 18911 1727096288.15681: Set connection var ansible_timeout to 10 18911 1727096288.15689: Set connection var ansible_shell_type to sh 18911 1727096288.15701: Set connection var ansible_module_compression to ZIP_DEFLATED 18911 1727096288.15710: Set connection var ansible_pipelining to False 18911 1727096288.15719: Set connection var ansible_connection to ssh 18911 1727096288.15746: variable 'ansible_shell_executable' from source: unknown 18911 1727096288.15755: variable 'ansible_connection' from source: unknown 18911 1727096288.15763: variable 'ansible_module_compression' from source: unknown 18911 1727096288.15773: variable 'ansible_shell_type' from source: unknown 18911 1727096288.15781: variable 'ansible_shell_executable' from source: unknown 18911 1727096288.16176: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096288.16180: variable 'ansible_pipelining' from source: unknown 18911 1727096288.16182: variable 'ansible_timeout' from 
source: unknown 18911 1727096288.16184: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096288.16187: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18911 1727096288.16190: variable 'omit' from source: magic vars 18911 1727096288.16192: starting attempt loop 18911 1727096288.16194: running the handler 18911 1727096288.16195: _low_level_execute_command(): starting 18911 1727096288.16197: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18911 1727096288.17627: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096288.17646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096288.17828: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096288.17843: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096288.17931: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096288.19652: stdout chunk (state=3): >>>/root <<< 18911 1727096288.19828: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096288.19842: stdout chunk (state=3): >>><<< 18911 1727096288.20217: stderr chunk (state=3): >>><<< 18911 1727096288.20330: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096288.20334: _low_level_execute_command(): starting 18911 1727096288.20337: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096288.202403-19294-161054686685152 `" && echo 
ansible-tmp-1727096288.202403-19294-161054686685152="` echo /root/.ansible/tmp/ansible-tmp-1727096288.202403-19294-161054686685152 `" ) && sleep 0' 18911 1727096288.21405: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096288.21421: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096288.21521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096288.21817: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096288.21899: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096288.23859: stdout chunk (state=3): >>>ansible-tmp-1727096288.202403-19294-161054686685152=/root/.ansible/tmp/ansible-tmp-1727096288.202403-19294-161054686685152 <<< 18911 1727096288.23999: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096288.24010: stdout chunk (state=3): >>><<< 18911 1727096288.24024: stderr chunk (state=3): >>><<< 18911 1727096288.24046: _low_level_execute_command() done: 
rc=0, stdout=ansible-tmp-1727096288.202403-19294-161054686685152=/root/.ansible/tmp/ansible-tmp-1727096288.202403-19294-161054686685152 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096288.24085: variable 'ansible_module_compression' from source: unknown 18911 1727096288.24372: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18911d7od04qi/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 18911 1727096288.24375: variable 'ansible_facts' from source: unknown 18911 1727096288.24431: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096288.202403-19294-161054686685152/AnsiballZ_command.py 18911 1727096288.25097: Sending initial data 18911 1727096288.25100: Sent initial data (155 bytes) 18911 1727096288.26286: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096288.26508: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096288.26577: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096288.28271: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18911 
1727096288.28328: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18911 1727096288.28392: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18911d7od04qi/tmp6yskt_z9 /root/.ansible/tmp/ansible-tmp-1727096288.202403-19294-161054686685152/AnsiballZ_command.py <<< 18911 1727096288.28395: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096288.202403-19294-161054686685152/AnsiballZ_command.py" <<< 18911 1727096288.28478: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18911d7od04qi/tmp6yskt_z9" to remote "/root/.ansible/tmp/ansible-tmp-1727096288.202403-19294-161054686685152/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096288.202403-19294-161054686685152/AnsiballZ_command.py" <<< 18911 1727096288.29965: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096288.29982: stdout chunk (state=3): >>><<< 18911 1727096288.29994: stderr chunk (state=3): >>><<< 18911 1727096288.30019: done transferring module to remote 18911 1727096288.30033: _low_level_execute_command(): starting 18911 1727096288.30091: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096288.202403-19294-161054686685152/ /root/.ansible/tmp/ansible-tmp-1727096288.202403-19294-161054686685152/AnsiballZ_command.py && sleep 0' 18911 1727096288.31319: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096288.31379: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096288.31382: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096288.31684: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096288.33419: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096288.33541: stderr chunk (state=3): >>><<< 18911 1727096288.33551: stdout chunk (state=3): >>><<< 18911 1727096288.33580: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096288.33648: _low_level_execute_command(): starting 18911 1727096288.33660: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096288.202403-19294-161054686685152/AnsiballZ_command.py && sleep 0' 18911 1727096288.35074: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096288.35226: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096288.35786: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096288.51631: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", 
"add", "lsr27", "type", "veth", "peer", "name", "peerlsr27"], "start": "2024-09-23 08:58:08.508914", "end": "2024-09-23 08:58:08.513978", "delta": "0:00:00.005064", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add lsr27 type veth peer name peerlsr27", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 18911 1727096288.54035: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. <<< 18911 1727096288.54039: stdout chunk (state=3): >>><<< 18911 1727096288.54042: stderr chunk (state=3): >>><<< 18911 1727096288.54070: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "lsr27", "type", "veth", "peer", "name", "peerlsr27"], "start": "2024-09-23 08:58:08.508914", "end": "2024-09-23 08:58:08.513978", "delta": "0:00:00.005064", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add lsr27 type veth peer name peerlsr27", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 18911 1727096288.54276: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add lsr27 type veth peer name peerlsr27', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096288.202403-19294-161054686685152/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18911 1727096288.54280: _low_level_execute_command(): starting 18911 1727096288.54282: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096288.202403-19294-161054686685152/ > /dev/null 2>&1 && sleep 0' 18911 1727096288.55286: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096288.55289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096288.55292: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096288.55294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096288.55584: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096288.55694: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096288.59956: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096288.59966: stdout chunk (state=3): >>><<< 18911 1727096288.59982: stderr chunk (state=3): >>><<< 18911 1727096288.60018: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096288.60218: handler run complete 18911 1727096288.60221: Evaluated conditional (False): False 18911 1727096288.60224: attempt loop complete, returning result 18911 1727096288.60226: variable 'item' from source: unknown 18911 1727096288.60307: variable 'item' from source: unknown ok: [managed_node1] => (item=ip link add lsr27 type veth peer name peerlsr27) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "lsr27", "type", "veth", "peer", "name", "peerlsr27" ], "delta": "0:00:00.005064", "end": "2024-09-23 08:58:08.513978", "item": "ip link add lsr27 type veth peer name peerlsr27", "rc": 0, "start": "2024-09-23 08:58:08.508914" } 18911 1727096288.60809: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096288.60811: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096288.60814: variable 'omit' from source: magic vars 18911 1727096288.61375: variable 'ansible_distribution_major_version' from source: facts 18911 1727096288.61378: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096288.61542: variable 'type' from source: set_fact 18911 1727096288.61635: variable 'state' from source: include params 18911 1727096288.61639: variable 'interface' from source: set_fact 18911 1727096288.61642: variable 'current_interfaces' from source: set_fact 
18911 1727096288.61644: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 18911 1727096288.61647: variable 'omit' from source: magic vars 18911 1727096288.61649: variable 'omit' from source: magic vars 18911 1727096288.61853: variable 'item' from source: unknown 18911 1727096288.61884: variable 'item' from source: unknown 18911 1727096288.61935: variable 'omit' from source: magic vars 18911 1727096288.62073: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18911 1727096288.62076: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096288.62079: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096288.62081: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18911 1727096288.62083: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096288.62085: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096288.62289: Set connection var ansible_shell_executable to /bin/sh 18911 1727096288.62292: Set connection var ansible_timeout to 10 18911 1727096288.62294: Set connection var ansible_shell_type to sh 18911 1727096288.62296: Set connection var ansible_module_compression to ZIP_DEFLATED 18911 1727096288.62298: Set connection var ansible_pipelining to False 18911 1727096288.62359: Set connection var ansible_connection to ssh 18911 1727096288.62386: variable 'ansible_shell_executable' from source: unknown 18911 1727096288.62399: variable 'ansible_connection' from source: unknown 18911 1727096288.62506: variable 'ansible_module_compression' from source: unknown 18911 1727096288.62509: variable 'ansible_shell_type' from source: unknown 18911 
1727096288.62511: variable 'ansible_shell_executable' from source: unknown 18911 1727096288.62513: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096288.62515: variable 'ansible_pipelining' from source: unknown 18911 1727096288.62517: variable 'ansible_timeout' from source: unknown 18911 1727096288.62519: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096288.62653: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18911 1727096288.62736: variable 'omit' from source: magic vars 18911 1727096288.62747: starting attempt loop 18911 1727096288.62754: running the handler 18911 1727096288.62766: _low_level_execute_command(): starting 18911 1727096288.62776: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18911 1727096288.64048: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096288.64064: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096288.64085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096288.64203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096288.64328: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096288.64422: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096288.64547: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096288.66250: stdout chunk (state=3): >>>/root <<< 18911 1727096288.66536: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096288.66539: stdout chunk (state=3): >>><<< 18911 1727096288.66541: stderr chunk (state=3): >>><<< 18911 1727096288.66548: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096288.66551: _low_level_execute_command(): starting 18911 1727096288.66553: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096288.6651597-19294-78112885474616 `" && echo ansible-tmp-1727096288.6651597-19294-78112885474616="` echo /root/.ansible/tmp/ansible-tmp-1727096288.6651597-19294-78112885474616 `" ) && sleep 0' 18911 1727096288.67985: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096288.68038: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096288.68135: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096288.68161: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096288.68259: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 
1727096288.70229: stdout chunk (state=3): >>>ansible-tmp-1727096288.6651597-19294-78112885474616=/root/.ansible/tmp/ansible-tmp-1727096288.6651597-19294-78112885474616 <<< 18911 1727096288.70350: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096288.70501: stderr chunk (state=3): >>><<< 18911 1727096288.70505: stdout chunk (state=3): >>><<< 18911 1727096288.70521: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096288.6651597-19294-78112885474616=/root/.ansible/tmp/ansible-tmp-1727096288.6651597-19294-78112885474616 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096288.70587: variable 'ansible_module_compression' from source: unknown 18911 1727096288.70774: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18911d7od04qi/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 18911 
1727096288.70785: variable 'ansible_facts' from source: unknown 18911 1727096288.71003: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096288.6651597-19294-78112885474616/AnsiballZ_command.py 18911 1727096288.71296: Sending initial data 18911 1727096288.71300: Sent initial data (155 bytes) 18911 1727096288.72319: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096288.72322: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 18911 1727096288.72324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096288.72326: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096288.72328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096288.72493: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096288.72506: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096288.72601: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096288.74318: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension 
"posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 18911 1727096288.74335: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18911 1727096288.74448: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18911 1727096288.74494: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18911d7od04qi/tmp7fpna846 /root/.ansible/tmp/ansible-tmp-1727096288.6651597-19294-78112885474616/AnsiballZ_command.py <<< 18911 1727096288.74518: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096288.6651597-19294-78112885474616/AnsiballZ_command.py" <<< 18911 1727096288.74672: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18911d7od04qi/tmp7fpna846" to remote "/root/.ansible/tmp/ansible-tmp-1727096288.6651597-19294-78112885474616/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096288.6651597-19294-78112885474616/AnsiballZ_command.py" <<< 18911 1727096288.75984: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096288.76039: stderr chunk (state=3): >>><<< 18911 1727096288.76049: stdout chunk (state=3): >>><<< 18911 1727096288.76105: done transferring module to remote 18911 1727096288.76365: _low_level_execute_command(): 
starting 18911 1727096288.76370: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096288.6651597-19294-78112885474616/ /root/.ansible/tmp/ansible-tmp-1727096288.6651597-19294-78112885474616/AnsiballZ_command.py && sleep 0' 18911 1727096288.77437: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096288.77459: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096288.77476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096288.77622: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096288.77634: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096288.77720: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096288.79611: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096288.79649: stderr chunk (state=3): >>><<< 18911 1727096288.79658: stdout chunk (state=3): >>><<< 18911 1727096288.79974: _low_level_execute_command() done: 
rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096288.79978: _low_level_execute_command(): starting 18911 1727096288.79981: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096288.6651597-19294-78112885474616/AnsiballZ_command.py && sleep 0' 18911 1727096288.81018: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096288.81031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096288.81051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096288.81199: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096288.81210: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096288.81389: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096288.97304: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerlsr27", "up"], "start": "2024-09-23 08:58:08.966625", "end": "2024-09-23 08:58:08.970250", "delta": "0:00:00.003625", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerlsr27 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 18911 1727096288.98716: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
<<< 18911 1727096288.98751: stderr chunk (state=3): >>><<< 18911 1727096288.98754: stdout chunk (state=3): >>><<< 18911 1727096288.98777: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerlsr27", "up"], "start": "2024-09-23 08:58:08.966625", "end": "2024-09-23 08:58:08.970250", "delta": "0:00:00.003625", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerlsr27 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
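Note that the module JSON above reports `"changed": true`, while the per-item result printed further down in the trace shows `"changed": false`. A minimal sketch of the override that would explain this, assuming the task sets `changed_when: false` (the `Evaluated conditional (False): False` line is consistent with that assumption; the helper below is hypothetical, not Ansible's real API):

```python
# Module result as returned by ansible.legacy.command in the trace above.
module_result = {"changed": True, "rc": 0,
                 "cmd": ["ip", "link", "set", "peerlsr27", "up"]}

def apply_changed_when(result, expr):
    # Hypothetical helper: Ansible evaluates the changed_when expression
    # against the result, and its boolean outcome replaces whatever
    # "changed" value the module itself reported.
    final = dict(result)
    final["changed"] = bool(expr(final))
    return final

# changed_when: false  ->  the displayed result shows "changed": false.
final_result = apply_changed_when(module_result, lambda r: False)
```

This matches the trace: the command ran (and the module flagged a change), but the task result surfaced to the callback is not marked changed.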
18911 1727096288.98824: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerlsr27 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096288.6651597-19294-78112885474616/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18911 1727096288.99022: _low_level_execute_command(): starting 18911 1727096288.99025: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096288.6651597-19294-78112885474616/ > /dev/null 2>&1 && sleep 0' 18911 1727096289.00044: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096289.00149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096289.00296: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096289.00446: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096289.02421: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096289.02466: stdout chunk (state=3): >>><<< 18911 1727096289.02673: stderr chunk (state=3): >>><<< 18911 1727096289.02677: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096289.02680: handler run complete 18911 1727096289.02682: Evaluated conditional (False): False 18911 1727096289.02686: 
attempt loop complete, returning result 18911 1727096289.02689: variable 'item' from source: unknown 18911 1727096289.02896: variable 'item' from source: unknown ok: [managed_node1] => (item=ip link set peerlsr27 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerlsr27", "up" ], "delta": "0:00:00.003625", "end": "2024-09-23 08:58:08.970250", "item": "ip link set peerlsr27 up", "rc": 0, "start": "2024-09-23 08:58:08.966625" } 18911 1727096289.03265: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096289.03270: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096289.03273: variable 'omit' from source: magic vars 18911 1727096289.03856: variable 'ansible_distribution_major_version' from source: facts 18911 1727096289.03860: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096289.04184: variable 'type' from source: set_fact 18911 1727096289.04187: variable 'state' from source: include params 18911 1727096289.04190: variable 'interface' from source: set_fact 18911 1727096289.04192: variable 'current_interfaces' from source: set_fact 18911 1727096289.04194: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 18911 1727096289.04196: variable 'omit' from source: magic vars 18911 1727096289.04223: variable 'omit' from source: magic vars 18911 1727096289.04327: variable 'item' from source: unknown 18911 1727096289.04447: variable 'item' from source: unknown 18911 1727096289.04527: variable 'omit' from source: magic vars 18911 1727096289.04554: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18911 1727096289.04582: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096289.04629: Loading 
ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096289.04782: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18911 1727096289.04786: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096289.04788: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096289.04978: Set connection var ansible_shell_executable to /bin/sh 18911 1727096289.05392: Set connection var ansible_timeout to 10 18911 1727096289.05395: Set connection var ansible_shell_type to sh 18911 1727096289.05398: Set connection var ansible_module_compression to ZIP_DEFLATED 18911 1727096289.05400: Set connection var ansible_pipelining to False 18911 1727096289.05402: Set connection var ansible_connection to ssh 18911 1727096289.05404: variable 'ansible_shell_executable' from source: unknown 18911 1727096289.05406: variable 'ansible_connection' from source: unknown 18911 1727096289.05408: variable 'ansible_module_compression' from source: unknown 18911 1727096289.05410: variable 'ansible_shell_type' from source: unknown 18911 1727096289.05412: variable 'ansible_shell_executable' from source: unknown 18911 1727096289.05414: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096289.05416: variable 'ansible_pipelining' from source: unknown 18911 1727096289.05418: variable 'ansible_timeout' from source: unknown 18911 1727096289.05420: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096289.05718: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18911 1727096289.05930: variable 'omit' from source: magic vars 18911 
1727096289.05934: starting attempt loop 18911 1727096289.05936: running the handler 18911 1727096289.05942: _low_level_execute_command(): starting 18911 1727096289.05945: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18911 1727096289.07598: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096289.07612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096289.07696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096289.07886: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096289.07890: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096289.08084: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096289.08355: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096289.10374: stdout chunk (state=3): >>>/root <<< 18911 1727096289.10379: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096289.10381: stderr chunk (state=3): >>><<< 18911 1727096289.10383: 
stdout chunk (state=3): >>><<< 18911 1727096289.10387: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096289.10390: _low_level_execute_command(): starting 18911 1727096289.10392: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096289.1021788-19294-214881252244903 `" && echo ansible-tmp-1727096289.1021788-19294-214881252244903="` echo /root/.ansible/tmp/ansible-tmp-1727096289.1021788-19294-214881252244903 `" ) && sleep 0' 18911 1727096289.11577: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096289.11670: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096289.11900: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096289.11904: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096289.13828: stdout chunk (state=3): >>>ansible-tmp-1727096289.1021788-19294-214881252244903=/root/.ansible/tmp/ansible-tmp-1727096289.1021788-19294-214881252244903 <<< 18911 1727096289.14075: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096289.14079: stdout chunk (state=3): >>><<< 18911 1727096289.14081: stderr chunk (state=3): >>><<< 18911 1727096289.14095: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096289.1021788-19294-214881252244903=/root/.ansible/tmp/ansible-tmp-1727096289.1021788-19294-214881252244903 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096289.14117: variable 'ansible_module_compression' from source: unknown 18911 1727096289.14182: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18911d7od04qi/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 18911 1727096289.14186: variable 'ansible_facts' from source: unknown 18911 1727096289.14366: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096289.1021788-19294-214881252244903/AnsiballZ_command.py 18911 1727096289.14655: Sending initial data 18911 1727096289.14658: Sent initial data (156 bytes) 18911 1727096289.15290: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096289.15332: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096289.15352: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096289.15369: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096289.15521: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096289.17116: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18911 1727096289.17248: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18911 1727096289.17283: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18911d7od04qi/tmpu6gt79sn /root/.ansible/tmp/ansible-tmp-1727096289.1021788-19294-214881252244903/AnsiballZ_command.py <<< 18911 1727096289.17287: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096289.1021788-19294-214881252244903/AnsiballZ_command.py" <<< 18911 1727096289.17350: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18911d7od04qi/tmpu6gt79sn" to remote "/root/.ansible/tmp/ansible-tmp-1727096289.1021788-19294-214881252244903/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096289.1021788-19294-214881252244903/AnsiballZ_command.py" <<< 18911 1727096289.18896: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096289.18900: stdout chunk (state=3): >>><<< 18911 1727096289.18906: stderr chunk (state=3): >>><<< 18911 1727096289.18927: done transferring module to remote 18911 1727096289.19082: _low_level_execute_command(): starting 18911 1727096289.19085: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096289.1021788-19294-214881252244903/ /root/.ansible/tmp/ansible-tmp-1727096289.1021788-19294-214881252244903/AnsiballZ_command.py && sleep 0' 18911 1727096289.20385: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096289.20396: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096289.20408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096289.20422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096289.20778: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 
18911 1727096289.20782: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096289.20869: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096289.20928: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096289.22818: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096289.22822: stdout chunk (state=3): >>><<< 18911 1727096289.22825: stderr chunk (state=3): >>><<< 18911 1727096289.22847: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096289.22851: _low_level_execute_command(): starting 18911 1727096289.22855: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096289.1021788-19294-214881252244903/AnsiballZ_command.py && sleep 0' 18911 1727096289.24173: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096289.24177: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096289.24180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096289.24182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096289.24185: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 18911 1727096289.24187: stderr chunk (state=3): >>>debug2: match not found <<< 18911 1727096289.24189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096289.24191: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18911 1727096289.24193: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.125 is address <<< 18911 1727096289.24196: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 18911 1727096289.24198: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096289.24200: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096289.24211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096289.24252: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 18911 1727096289.24258: stderr chunk (state=3): >>>debug2: match found <<< 18911 1727096289.24456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096289.24472: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096289.24581: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096289.40607: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "lsr27", "up"], "start": "2024-09-23 08:58:09.399308", "end": "2024-09-23 08:58:09.403299", "delta": "0:00:00.003991", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set lsr27 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 18911 1727096289.42373: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
<<< 18911 1727096289.42378: stdout chunk (state=3): >>><<< 18911 1727096289.42394: stderr chunk (state=3): >>><<< 18911 1727096289.42412: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "lsr27", "up"], "start": "2024-09-23 08:58:09.399308", "end": "2024-09-23 08:58:09.403299", "delta": "0:00:00.003991", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set lsr27 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
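The `"delta"` field in the module JSON above is simply `end` minus `start`. A quick consistency check of that relationship, parsing the result fields verbatim from the trace (trimmed to the relevant keys):

```python
import json
from datetime import datetime

# Result JSON copied (relevant fields only) from the trace above.
raw = ('{"changed": true, "rc": 0, "cmd": ["ip", "link", "set", "lsr27", "up"], '
       '"start": "2024-09-23 08:58:09.399308", "end": "2024-09-23 08:58:09.403299", '
       '"delta": "0:00:00.003991"}')

result = json.loads(raw)
fmt = "%Y-%m-%d %H:%M:%S.%f"
# str(timedelta) renders as H:MM:SS.ffffff, the same format the module emits.
elapsed = datetime.strptime(result["end"], fmt) - datetime.strptime(result["start"], fmt)
```

Subtracting the two timestamps reproduces the reported delta of about 4 ms for the `ip link set lsr27 up` invocation.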
18911 1727096289.42442: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set lsr27 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096289.1021788-19294-214881252244903/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18911 1727096289.42447: _low_level_execute_command(): starting 18911 1727096289.42453: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096289.1021788-19294-214881252244903/ > /dev/null 2>&1 && sleep 0' 18911 1727096289.43696: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096289.43821: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096289.43933: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096289.43955: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096289.43971: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096289.44066: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096289.45990: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096289.46044: stderr chunk (state=3): >>><<< 18911 1727096289.46047: stdout chunk (state=3): >>><<< 18911 1727096289.46116: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096289.46119: 
handler run complete 18911 1727096289.46138: Evaluated conditional (False): False 18911 1727096289.46147: attempt loop complete, returning result 18911 1727096289.46167: variable 'item' from source: unknown 18911 1727096289.46350: variable 'item' from source: unknown ok: [managed_node1] => (item=ip link set lsr27 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "lsr27", "up" ], "delta": "0:00:00.003991", "end": "2024-09-23 08:58:09.403299", "item": "ip link set lsr27 up", "rc": 0, "start": "2024-09-23 08:58:09.399308" } 18911 1727096289.46635: dumping result to json 18911 1727096289.46639: done dumping result, returning 18911 1727096289.46642: done running TaskExecutor() for managed_node1/TASK: Create veth interface lsr27 [0afff68d-5257-09a7-aae1-000000000135] 18911 1727096289.46644: sending task result for task 0afff68d-5257-09a7-aae1-000000000135 18911 1727096289.47784: done sending task result for task 0afff68d-5257-09a7-aae1-000000000135 18911 1727096289.47787: WORKER PROCESS EXITING 18911 1727096289.47913: no more pending results, returning what we have 18911 1727096289.47916: results queue empty 18911 1727096289.47917: checking for any_errors_fatal 18911 1727096289.47920: done checking for any_errors_fatal 18911 1727096289.47921: checking for max_fail_percentage 18911 1727096289.47929: done checking for max_fail_percentage 18911 1727096289.47930: checking to see if all hosts have failed and the running result is not ok 18911 1727096289.47931: done checking to see if all hosts have failed 18911 1727096289.47931: getting the remaining hosts for this loop 18911 1727096289.47932: done getting the remaining hosts for this loop 18911 1727096289.47935: getting the next task for host managed_node1 18911 1727096289.47940: done getting next task for host managed_node1 18911 1727096289.47942: ^ task is: TASK: Set up veth as managed by NetworkManager 18911 1727096289.47944: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18911 1727096289.47947: getting variables 18911 1727096289.47948: in VariableManager get_vars() 18911 1727096289.47973: Calling all_inventory to load vars for managed_node1 18911 1727096289.47975: Calling groups_inventory to load vars for managed_node1 18911 1727096289.47978: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096289.47987: Calling all_plugins_play to load vars for managed_node1 18911 1727096289.47989: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096289.47992: Calling groups_plugins_play to load vars for managed_node1 18911 1727096289.48366: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096289.49017: done with get_vars() 18911 1727096289.49027: done getting variables 18911 1727096289.49309: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Monday 23 September 2024 08:58:09 -0400 (0:00:01.390) 0:00:08.607 ****** 18911 1727096289.49336: entering _queue_task() for 
managed_node1/command 18911 1727096289.50206: worker is 1 (out of 1 available) 18911 1727096289.50333: exiting _queue_task() for managed_node1/command 18911 1727096289.50344: done queuing things up, now waiting for results queue to drain 18911 1727096289.50345: waiting for pending results... 18911 1727096289.50789: running TaskExecutor() for managed_node1/TASK: Set up veth as managed by NetworkManager 18911 1727096289.50900: in run() - task 0afff68d-5257-09a7-aae1-000000000136 18911 1727096289.51076: variable 'ansible_search_path' from source: unknown 18911 1727096289.51080: variable 'ansible_search_path' from source: unknown 18911 1727096289.51083: calling self._execute() 18911 1727096289.51155: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096289.51284: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096289.51427: variable 'omit' from source: magic vars 18911 1727096289.52029: variable 'ansible_distribution_major_version' from source: facts 18911 1727096289.52089: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096289.52453: variable 'type' from source: set_fact 18911 1727096289.52465: variable 'state' from source: include params 18911 1727096289.52478: Evaluated conditional (type == 'veth' and state == 'present'): True 18911 1727096289.52499: variable 'omit' from source: magic vars 18911 1727096289.52617: variable 'omit' from source: magic vars 18911 1727096289.52927: variable 'interface' from source: set_fact 18911 1727096289.52933: variable 'omit' from source: magic vars 18911 1727096289.52985: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18911 1727096289.53282: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18911 1727096289.53285: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18911 1727096289.53287: Loading 
ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096289.53289: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096289.53291: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18911 1727096289.53293: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096289.53294: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096289.53472: Set connection var ansible_shell_executable to /bin/sh 18911 1727096289.53625: Set connection var ansible_timeout to 10 18911 1727096289.53628: Set connection var ansible_shell_type to sh 18911 1727096289.53715: Set connection var ansible_module_compression to ZIP_DEFLATED 18911 1727096289.53718: Set connection var ansible_pipelining to False 18911 1727096289.53721: Set connection var ansible_connection to ssh 18911 1727096289.53723: variable 'ansible_shell_executable' from source: unknown 18911 1727096289.53725: variable 'ansible_connection' from source: unknown 18911 1727096289.53728: variable 'ansible_module_compression' from source: unknown 18911 1727096289.53731: variable 'ansible_shell_type' from source: unknown 18911 1727096289.53733: variable 'ansible_shell_executable' from source: unknown 18911 1727096289.53735: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096289.53737: variable 'ansible_pipelining' from source: unknown 18911 1727096289.53739: variable 'ansible_timeout' from source: unknown 18911 1727096289.53741: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096289.54075: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18911 1727096289.54079: variable 'omit' from source: magic vars 18911 1727096289.54090: starting attempt loop 18911 1727096289.54099: running the handler 18911 1727096289.54118: _low_level_execute_command(): starting 18911 1727096289.54158: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18911 1727096289.55805: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096289.56148: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096289.56152: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096289.56193: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096289.56294: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096289.58086: stdout chunk (state=3): >>>/root <<< 18911 1727096289.58130: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096289.58176: stderr chunk (state=3): >>><<< 18911 1727096289.58383: stdout chunk (state=3): >>><<< 18911 1727096289.58388: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096289.58391: _low_level_execute_command(): starting 18911 1727096289.58396: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096289.583081-19343-158762385365944 `" && echo ansible-tmp-1727096289.583081-19343-158762385365944="` echo /root/.ansible/tmp/ansible-tmp-1727096289.583081-19343-158762385365944 `" ) && sleep 0' 18911 1727096289.59815: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096289.59838: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096289.59939: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096289.61931: stdout chunk (state=3): >>>ansible-tmp-1727096289.583081-19343-158762385365944=/root/.ansible/tmp/ansible-tmp-1727096289.583081-19343-158762385365944 <<< 18911 1727096289.62199: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096289.62204: stdout chunk (state=3): >>><<< 18911 1727096289.62207: stderr chunk (state=3): >>><<< 18911 1727096289.62211: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096289.583081-19343-158762385365944=/root/.ansible/tmp/ansible-tmp-1727096289.583081-19343-158762385365944 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096289.62476: variable 'ansible_module_compression' from source: unknown 18911 1727096289.62480: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18911d7od04qi/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 18911 1727096289.62482: variable 'ansible_facts' from source: unknown 18911 1727096289.62626: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096289.583081-19343-158762385365944/AnsiballZ_command.py 18911 1727096289.62944: Sending initial data 18911 1727096289.62954: Sent initial data (155 bytes) 18911 1727096289.64106: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096289.64125: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096289.64145: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096289.64236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096289.64269: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096289.64291: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096289.64302: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096289.64419: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096289.66142: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18911 1727096289.66212: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18911 1727096289.66373: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18911d7od04qi/tmpw_qksh0u /root/.ansible/tmp/ansible-tmp-1727096289.583081-19343-158762385365944/AnsiballZ_command.py <<< 18911 1727096289.66376: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096289.583081-19343-158762385365944/AnsiballZ_command.py" <<< 18911 1727096289.66448: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18911d7od04qi/tmpw_qksh0u" to remote "/root/.ansible/tmp/ansible-tmp-1727096289.583081-19343-158762385365944/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096289.583081-19343-158762385365944/AnsiballZ_command.py" <<< 18911 1727096289.68021: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096289.68098: stderr chunk (state=3): >>><<< 18911 1727096289.68119: stdout chunk (state=3): >>><<< 18911 1727096289.68259: done transferring module to remote 18911 1727096289.68263: _low_level_execute_command(): starting 18911 1727096289.68266: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096289.583081-19343-158762385365944/ /root/.ansible/tmp/ansible-tmp-1727096289.583081-19343-158762385365944/AnsiballZ_command.py && sleep 0' 18911 1727096289.69288: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096289.69430: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096289.69516: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096289.69603: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096289.69637: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096289.69737: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096289.71978: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096289.71982: stdout chunk (state=3): >>><<< 18911 1727096289.71984: stderr chunk (state=3): >>><<< 18911 1727096289.71992: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096289.71995: _low_level_execute_command(): starting 18911 1727096289.71997: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096289.583081-19343-158762385365944/AnsiballZ_command.py && sleep 0' 18911 1727096289.72880: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096289.72898: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096289.72913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096289.73034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096289.73142: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096289.73158: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 18911 1727096289.73372: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096289.90606: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "lsr27", "managed", "true"], "start": "2024-09-23 08:58:09.885969", "end": "2024-09-23 08:58:09.903608", "delta": "0:00:00.017639", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set lsr27 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 18911 1727096289.92212: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. <<< 18911 1727096289.92226: stderr chunk (state=3): >>><<< 18911 1727096289.92229: stdout chunk (state=3): >>><<< 18911 1727096289.92255: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "lsr27", "managed", "true"], "start": "2024-09-23 08:58:09.885969", "end": "2024-09-23 08:58:09.903608", "delta": "0:00:00.017639", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set lsr27 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 18911 1727096289.92287: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set lsr27 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096289.583081-19343-158762385365944/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18911 1727096289.92294: _low_level_execute_command(): starting 18911 1727096289.92299: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096289.583081-19343-158762385365944/ > /dev/null 2>&1 && sleep 0' 18911 1727096289.92882: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 18911 1727096289.92885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096289.92888: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096289.92890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096289.92939: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096289.92946: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096289.92948: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096289.93012: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096289.94889: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096289.94901: stderr chunk (state=3): >>><<< 18911 1727096289.94904: stdout chunk (state=3): >>><<< 18911 1727096289.94924: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096289.94931: handler run complete 18911 1727096289.94948: Evaluated conditional (False): False 18911 1727096289.94956: attempt loop complete, returning result 18911 1727096289.94958: _execute() done 18911 1727096289.94961: dumping result to json 18911 1727096289.94970: done dumping result, returning 18911 1727096289.94978: done running TaskExecutor() for managed_node1/TASK: Set up veth as managed by NetworkManager [0afff68d-5257-09a7-aae1-000000000136] 18911 1727096289.94981: sending task result for task 0afff68d-5257-09a7-aae1-000000000136 18911 1727096289.95078: done sending task result for task 0afff68d-5257-09a7-aae1-000000000136 18911 1727096289.95080: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "nmcli", "d", "set", "lsr27", "managed", "true" ], "delta": "0:00:00.017639", "end": "2024-09-23 08:58:09.903608", "rc": 0, "start": "2024-09-23 08:58:09.885969" } 18911 1727096289.95136: no more pending results, returning what we have 18911 1727096289.95139: results queue empty 18911 1727096289.95140: checking for any_errors_fatal 18911 1727096289.95157: done checking for any_errors_fatal 
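The task that just completed ran `nmcli d set lsr27 managed true` over the multiplexed SSH connection. Reconstructed as a playbook fragment (the conditional is taken from the "Evaluated conditional (type == 'veth' and state == 'present'): True" line earlier in the log; the exact variable names in the real task file are an assumption):

```yaml
# Hedged sketch of "Set up veth as managed by NetworkManager", reconstructed
# from the command and conditionals visible in the log output.
- name: Set up veth as managed by NetworkManager
  command: nmcli d set {{ interface }} managed true
  when:
    - type == 'veth'       # both conditions evaluated True in the log
    - state == 'present'
  changed_when: false      # matches "changed": false in the displayed result
```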
18911 1727096289.95158: checking for max_fail_percentage 18911 1727096289.95159: done checking for max_fail_percentage 18911 1727096289.95160: checking to see if all hosts have failed and the running result is not ok 18911 1727096289.95161: done checking to see if all hosts have failed 18911 1727096289.95161: getting the remaining hosts for this loop 18911 1727096289.95162: done getting the remaining hosts for this loop 18911 1727096289.95165: getting the next task for host managed_node1 18911 1727096289.95174: done getting next task for host managed_node1 18911 1727096289.95176: ^ task is: TASK: Delete veth interface {{ interface }} 18911 1727096289.95179: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096289.95182: getting variables 18911 1727096289.95184: in VariableManager get_vars() 18911 1727096289.95213: Calling all_inventory to load vars for managed_node1 18911 1727096289.95215: Calling groups_inventory to load vars for managed_node1 18911 1727096289.95218: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096289.95229: Calling all_plugins_play to load vars for managed_node1 18911 1727096289.95232: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096289.95234: Calling groups_plugins_play to load vars for managed_node1 18911 1727096289.95385: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096289.95529: done with get_vars() 18911 1727096289.95536: done getting variables 18911 1727096289.95584: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 18911 1727096289.95675: variable 'interface' from source: set_fact TASK [Delete veth interface lsr27] ********************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Monday 23 September 2024 08:58:09 -0400 (0:00:00.463) 0:00:09.071 ****** 18911 1727096289.95697: entering _queue_task() for managed_node1/command 18911 1727096289.95903: worker is 1 (out of 1 available) 18911 1727096289.95930: exiting _queue_task() for managed_node1/command 18911 1727096289.95941: done queuing things up, now waiting for results queue to drain 18911 1727096289.95943: waiting for pending results... 
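For reference, the task that produced the `nmcli` result above ("Set up veth as managed by NetworkManager", rc=0) likely resembles the following minimal sketch. The command and task name are taken from the log; the task body itself is an assumption, not the contents of the actual test file, and `interface` (rendered as `lsr27`) comes from `set_fact` per the log.

```yaml
# Hypothetical reconstruction of the task logged above -- the real task
# lives in the fedora.linux_system_roles network test playbooks; only
# the task name and the executed command are taken from the log.
- name: Set up veth as managed by NetworkManager
  command: nmcli d set {{ interface }} managed true
```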
18911 1727096289.96090: running TaskExecutor() for managed_node1/TASK: Delete veth interface lsr27 18911 1727096289.96152: in run() - task 0afff68d-5257-09a7-aae1-000000000137 18911 1727096289.96170: variable 'ansible_search_path' from source: unknown 18911 1727096289.96178: variable 'ansible_search_path' from source: unknown 18911 1727096289.96200: calling self._execute() 18911 1727096289.96263: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096289.96268: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096289.96275: variable 'omit' from source: magic vars 18911 1727096289.96555: variable 'ansible_distribution_major_version' from source: facts 18911 1727096289.96566: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096289.96845: variable 'type' from source: set_fact 18911 1727096289.96890: variable 'state' from source: include params 18911 1727096289.96893: variable 'interface' from source: set_fact 18911 1727096289.96894: variable 'current_interfaces' from source: set_fact 18911 1727096289.96896: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False 18911 1727096289.96898: when evaluation is False, skipping this task 18911 1727096289.96900: _execute() done 18911 1727096289.96902: dumping result to json 18911 1727096289.96903: done dumping result, returning 18911 1727096289.96905: done running TaskExecutor() for managed_node1/TASK: Delete veth interface lsr27 [0afff68d-5257-09a7-aae1-000000000137] 18911 1727096289.96907: sending task result for task 0afff68d-5257-09a7-aae1-000000000137 18911 1727096289.96973: done sending task result for task 0afff68d-5257-09a7-aae1-000000000137 skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 18911 1727096289.97029: no more pending results, 
returning what we have 18911 1727096289.97036: results queue empty 18911 1727096289.97038: checking for any_errors_fatal 18911 1727096289.97044: done checking for any_errors_fatal 18911 1727096289.97045: checking for max_fail_percentage 18911 1727096289.97047: done checking for max_fail_percentage 18911 1727096289.97048: checking to see if all hosts have failed and the running result is not ok 18911 1727096289.97049: done checking to see if all hosts have failed 18911 1727096289.97050: getting the remaining hosts for this loop 18911 1727096289.97051: done getting the remaining hosts for this loop 18911 1727096289.97054: getting the next task for host managed_node1 18911 1727096289.97064: done getting next task for host managed_node1 18911 1727096289.97068: ^ task is: TASK: Create dummy interface {{ interface }} 18911 1727096289.97071: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096289.97074: getting variables 18911 1727096289.97076: in VariableManager get_vars() 18911 1727096289.97108: Calling all_inventory to load vars for managed_node1 18911 1727096289.97111: Calling groups_inventory to load vars for managed_node1 18911 1727096289.97114: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096289.97124: Calling all_plugins_play to load vars for managed_node1 18911 1727096289.97127: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096289.97135: Calling groups_plugins_play to load vars for managed_node1 18911 1727096289.97291: WORKER PROCESS EXITING 18911 1727096289.97301: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096289.97421: done with get_vars() 18911 1727096289.97428: done getting variables 18911 1727096289.97474: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 18911 1727096289.97582: variable 'interface' from source: set_fact TASK [Create dummy interface lsr27] ******************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Monday 23 September 2024 08:58:09 -0400 (0:00:00.019) 0:00:09.090 ****** 18911 1727096289.97613: entering _queue_task() for managed_node1/command 18911 1727096289.97852: worker is 1 (out of 1 available) 18911 1727096289.97865: exiting _queue_task() for managed_node1/command 18911 1727096289.97879: done queuing things up, now waiting for results queue to drain 18911 1727096289.97881: waiting for pending results... 
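The repeated "Conditional result was False" skips above follow one pattern: `manage_test_interface.yml` holds a create/delete task per interface type, each guarded by a `when:` clause on `type`, `state`, and `current_interfaces`, so only the matching task runs. A hedged sketch, with `when:` expressions copied from the `false_condition` fields in the log but command bodies assumed:

```yaml
# Sketch of the guard pattern seen in the skip results above.
# The `when:` strings match the logged false_condition values;
# the `ip link` commands are illustrative assumptions.
- name: Delete veth interface {{ interface }}
  command: ip link del {{ interface }}
  when: type == 'veth' and state == 'absent' and interface in current_interfaces

- name: Create dummy interface {{ interface }}
  command: ip link add {{ interface }} type dummy
  when: type == 'dummy' and state == 'present' and interface not in current_interfaces
```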
18911 1727096289.98088: running TaskExecutor() for managed_node1/TASK: Create dummy interface lsr27 18911 1727096289.98152: in run() - task 0afff68d-5257-09a7-aae1-000000000138 18911 1727096289.98169: variable 'ansible_search_path' from source: unknown 18911 1727096289.98174: variable 'ansible_search_path' from source: unknown 18911 1727096289.98230: calling self._execute() 18911 1727096289.98296: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096289.98326: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096289.98329: variable 'omit' from source: magic vars 18911 1727096289.99222: variable 'ansible_distribution_major_version' from source: facts 18911 1727096289.99260: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096289.99760: variable 'type' from source: set_fact 18911 1727096289.99770: variable 'state' from source: include params 18911 1727096289.99773: variable 'interface' from source: set_fact 18911 1727096289.99776: variable 'current_interfaces' from source: set_fact 18911 1727096289.99779: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 18911 1727096289.99782: when evaluation is False, skipping this task 18911 1727096289.99886: _execute() done 18911 1727096289.99890: dumping result to json 18911 1727096289.99893: done dumping result, returning 18911 1727096289.99895: done running TaskExecutor() for managed_node1/TASK: Create dummy interface lsr27 [0afff68d-5257-09a7-aae1-000000000138] 18911 1727096289.99898: sending task result for task 0afff68d-5257-09a7-aae1-000000000138 skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 18911 1727096290.00139: no more pending results, returning what we have 18911 1727096290.00143: results queue empty 18911 
1727096290.00144: checking for any_errors_fatal 18911 1727096290.00150: done checking for any_errors_fatal 18911 1727096290.00151: checking for max_fail_percentage 18911 1727096290.00152: done checking for max_fail_percentage 18911 1727096290.00153: checking to see if all hosts have failed and the running result is not ok 18911 1727096290.00154: done checking to see if all hosts have failed 18911 1727096290.00154: getting the remaining hosts for this loop 18911 1727096290.00155: done getting the remaining hosts for this loop 18911 1727096290.00158: getting the next task for host managed_node1 18911 1727096290.00171: done getting next task for host managed_node1 18911 1727096290.00173: ^ task is: TASK: Delete dummy interface {{ interface }} 18911 1727096290.00176: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096290.00179: getting variables 18911 1727096290.00181: in VariableManager get_vars() 18911 1727096290.00207: Calling all_inventory to load vars for managed_node1 18911 1727096290.00210: Calling groups_inventory to load vars for managed_node1 18911 1727096290.00212: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096290.00221: Calling all_plugins_play to load vars for managed_node1 18911 1727096290.00224: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096290.00226: Calling groups_plugins_play to load vars for managed_node1 18911 1727096290.00447: done sending task result for task 0afff68d-5257-09a7-aae1-000000000138 18911 1727096290.00452: WORKER PROCESS EXITING 18911 1727096290.00507: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096290.00698: done with get_vars() 18911 1727096290.00705: done getting variables 18911 1727096290.00748: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 18911 1727096290.00827: variable 'interface' from source: set_fact TASK [Delete dummy interface lsr27] ******************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Monday 23 September 2024 08:58:10 -0400 (0:00:00.032) 0:00:09.122 ****** 18911 1727096290.00848: entering _queue_task() for managed_node1/command 18911 1727096290.01044: worker is 1 (out of 1 available) 18911 1727096290.01057: exiting _queue_task() for managed_node1/command 18911 1727096290.01069: done queuing things up, now waiting for results queue to drain 18911 1727096290.01071: waiting for pending results... 
18911 1727096290.01221: running TaskExecutor() for managed_node1/TASK: Delete dummy interface lsr27 18911 1727096290.01284: in run() - task 0afff68d-5257-09a7-aae1-000000000139 18911 1727096290.01303: variable 'ansible_search_path' from source: unknown 18911 1727096290.01306: variable 'ansible_search_path' from source: unknown 18911 1727096290.01338: calling self._execute() 18911 1727096290.01473: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096290.01477: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096290.01480: variable 'omit' from source: magic vars 18911 1727096290.01846: variable 'ansible_distribution_major_version' from source: facts 18911 1727096290.01948: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096290.02143: variable 'type' from source: set_fact 18911 1727096290.02154: variable 'state' from source: include params 18911 1727096290.02163: variable 'interface' from source: set_fact 18911 1727096290.02175: variable 'current_interfaces' from source: set_fact 18911 1727096290.02188: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 18911 1727096290.02196: when evaluation is False, skipping this task 18911 1727096290.02204: _execute() done 18911 1727096290.02272: dumping result to json 18911 1727096290.02275: done dumping result, returning 18911 1727096290.02278: done running TaskExecutor() for managed_node1/TASK: Delete dummy interface lsr27 [0afff68d-5257-09a7-aae1-000000000139] 18911 1727096290.02280: sending task result for task 0afff68d-5257-09a7-aae1-000000000139 skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 18911 1727096290.02580: no more pending results, returning what we have 18911 1727096290.02583: results queue empty 18911 1727096290.02584: 
checking for any_errors_fatal 18911 1727096290.02588: done checking for any_errors_fatal 18911 1727096290.02588: checking for max_fail_percentage 18911 1727096290.02589: done checking for max_fail_percentage 18911 1727096290.02590: checking to see if all hosts have failed and the running result is not ok 18911 1727096290.02591: done checking to see if all hosts have failed 18911 1727096290.02592: getting the remaining hosts for this loop 18911 1727096290.02592: done getting the remaining hosts for this loop 18911 1727096290.02595: getting the next task for host managed_node1 18911 1727096290.02600: done getting next task for host managed_node1 18911 1727096290.02602: ^ task is: TASK: Create tap interface {{ interface }} 18911 1727096290.02605: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096290.02608: getting variables 18911 1727096290.02609: in VariableManager get_vars() 18911 1727096290.02633: Calling all_inventory to load vars for managed_node1 18911 1727096290.02636: Calling groups_inventory to load vars for managed_node1 18911 1727096290.02639: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096290.02648: Calling all_plugins_play to load vars for managed_node1 18911 1727096290.02650: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096290.02653: Calling groups_plugins_play to load vars for managed_node1 18911 1727096290.02801: done sending task result for task 0afff68d-5257-09a7-aae1-000000000139 18911 1727096290.02805: WORKER PROCESS EXITING 18911 1727096290.02826: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096290.03013: done with get_vars() 18911 1727096290.03023: done getting variables 18911 1727096290.03091: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 18911 1727096290.03210: variable 'interface' from source: set_fact TASK [Create tap interface lsr27] ********************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Monday 23 September 2024 08:58:10 -0400 (0:00:00.023) 0:00:09.146 ****** 18911 1727096290.03238: entering _queue_task() for managed_node1/command 18911 1727096290.03481: worker is 1 (out of 1 available) 18911 1727096290.03493: exiting _queue_task() for managed_node1/command 18911 1727096290.03506: done queuing things up, now waiting for results queue to drain 18911 1727096290.03507: waiting for pending results... 
18911 1727096290.03741: running TaskExecutor() for managed_node1/TASK: Create tap interface lsr27 18911 1727096290.04072: in run() - task 0afff68d-5257-09a7-aae1-00000000013a 18911 1727096290.04176: variable 'ansible_search_path' from source: unknown 18911 1727096290.04180: variable 'ansible_search_path' from source: unknown 18911 1727096290.04182: calling self._execute() 18911 1727096290.04315: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096290.04327: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096290.04343: variable 'omit' from source: magic vars 18911 1727096290.05028: variable 'ansible_distribution_major_version' from source: facts 18911 1727096290.05056: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096290.05283: variable 'type' from source: set_fact 18911 1727096290.05293: variable 'state' from source: include params 18911 1727096290.05301: variable 'interface' from source: set_fact 18911 1727096290.05308: variable 'current_interfaces' from source: set_fact 18911 1727096290.05318: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 18911 1727096290.05324: when evaluation is False, skipping this task 18911 1727096290.05331: _execute() done 18911 1727096290.05337: dumping result to json 18911 1727096290.05344: done dumping result, returning 18911 1727096290.05353: done running TaskExecutor() for managed_node1/TASK: Create tap interface lsr27 [0afff68d-5257-09a7-aae1-00000000013a] 18911 1727096290.05361: sending task result for task 0afff68d-5257-09a7-aae1-00000000013a 18911 1727096290.05465: done sending task result for task 0afff68d-5257-09a7-aae1-00000000013a skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 18911 1727096290.05521: no more pending 
results, returning what we have 18911 1727096290.05525: results queue empty 18911 1727096290.05525: checking for any_errors_fatal 18911 1727096290.05533: done checking for any_errors_fatal 18911 1727096290.05534: checking for max_fail_percentage 18911 1727096290.05536: done checking for max_fail_percentage 18911 1727096290.05536: checking to see if all hosts have failed and the running result is not ok 18911 1727096290.05537: done checking to see if all hosts have failed 18911 1727096290.05538: getting the remaining hosts for this loop 18911 1727096290.05539: done getting the remaining hosts for this loop 18911 1727096290.05542: getting the next task for host managed_node1 18911 1727096290.05549: done getting next task for host managed_node1 18911 1727096290.05551: ^ task is: TASK: Delete tap interface {{ interface }} 18911 1727096290.05555: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096290.05558: getting variables 18911 1727096290.05560: in VariableManager get_vars() 18911 1727096290.05593: Calling all_inventory to load vars for managed_node1 18911 1727096290.05601: Calling groups_inventory to load vars for managed_node1 18911 1727096290.05605: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096290.05619: Calling all_plugins_play to load vars for managed_node1 18911 1727096290.05622: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096290.05625: Calling groups_plugins_play to load vars for managed_node1 18911 1727096290.06100: WORKER PROCESS EXITING 18911 1727096290.06124: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096290.06320: done with get_vars() 18911 1727096290.06329: done getting variables 18911 1727096290.06390: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 18911 1727096290.06550: variable 'interface' from source: set_fact TASK [Delete tap interface lsr27] ********************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Monday 23 September 2024 08:58:10 -0400 (0:00:00.033) 0:00:09.180 ****** 18911 1727096290.06578: entering _queue_task() for managed_node1/command 18911 1727096290.06811: worker is 1 (out of 1 available) 18911 1727096290.06822: exiting _queue_task() for managed_node1/command 18911 1727096290.06836: done queuing things up, now waiting for results queue to drain 18911 1727096290.06837: waiting for pending results... 
18911 1727096290.07079: running TaskExecutor() for managed_node1/TASK: Delete tap interface lsr27 18911 1727096290.07181: in run() - task 0afff68d-5257-09a7-aae1-00000000013b 18911 1727096290.07205: variable 'ansible_search_path' from source: unknown 18911 1727096290.07213: variable 'ansible_search_path' from source: unknown 18911 1727096290.07258: calling self._execute() 18911 1727096290.07348: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096290.07358: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096290.07375: variable 'omit' from source: magic vars 18911 1727096290.07710: variable 'ansible_distribution_major_version' from source: facts 18911 1727096290.07727: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096290.07934: variable 'type' from source: set_fact 18911 1727096290.07945: variable 'state' from source: include params 18911 1727096290.07958: variable 'interface' from source: set_fact 18911 1727096290.07967: variable 'current_interfaces' from source: set_fact 18911 1727096290.07983: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 18911 1727096290.07991: when evaluation is False, skipping this task 18911 1727096290.07997: _execute() done 18911 1727096290.08006: dumping result to json 18911 1727096290.08015: done dumping result, returning 18911 1727096290.08031: done running TaskExecutor() for managed_node1/TASK: Delete tap interface lsr27 [0afff68d-5257-09a7-aae1-00000000013b] 18911 1727096290.08042: sending task result for task 0afff68d-5257-09a7-aae1-00000000013b skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 18911 1727096290.08403: no more pending results, returning what we have 18911 1727096290.08406: results queue empty 18911 1727096290.08407: checking 
for any_errors_fatal 18911 1727096290.08411: done checking for any_errors_fatal 18911 1727096290.08412: checking for max_fail_percentage 18911 1727096290.08414: done checking for max_fail_percentage 18911 1727096290.08414: checking to see if all hosts have failed and the running result is not ok 18911 1727096290.08415: done checking to see if all hosts have failed 18911 1727096290.08416: getting the remaining hosts for this loop 18911 1727096290.08417: done getting the remaining hosts for this loop 18911 1727096290.08420: getting the next task for host managed_node1 18911 1727096290.08429: done getting next task for host managed_node1 18911 1727096290.08431: ^ task is: TASK: Include the task 'assert_device_present.yml' 18911 1727096290.08434: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096290.08437: getting variables 18911 1727096290.08439: in VariableManager get_vars() 18911 1727096290.08464: Calling all_inventory to load vars for managed_node1 18911 1727096290.08467: Calling groups_inventory to load vars for managed_node1 18911 1727096290.08474: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096290.08484: Calling all_plugins_play to load vars for managed_node1 18911 1727096290.08487: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096290.08490: Calling groups_plugins_play to load vars for managed_node1 18911 1727096290.08957: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096290.09465: done with get_vars() 18911 1727096290.09478: done getting variables 18911 1727096290.09579: done sending task result for task 0afff68d-5257-09a7-aae1-00000000013b 18911 1727096290.09586: WORKER PROCESS EXITING TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:30 Monday 23 September 2024 08:58:10 -0400 (0:00:00.030) 0:00:09.210 ****** 18911 1727096290.09659: entering _queue_task() for managed_node1/include_tasks 18911 1727096290.10208: worker is 1 (out of 1 available) 18911 1727096290.10219: exiting _queue_task() for managed_node1/include_tasks 18911 1727096290.10230: done queuing things up, now waiting for results queue to drain 18911 1727096290.10232: waiting for pending results... 
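The next entries switch from `command` to `include_tasks`: the play pulls in `assert_device_present.yml`, whose blocks are then appended to the host's task list ("extending task lists for all hosts with included blocks"). A minimal sketch of that include step, with the file name taken from the logged `task path:` and everything else assumed:

```yaml
# Hedged sketch of the include logged below; the included file then
# runs 'get_interface_stat.yml' and the device assertions.
- name: Include the task 'assert_device_present.yml'
  include_tasks: tasks/assert_device_present.yml
```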
18911 1727096290.10564: running TaskExecutor() for managed_node1/TASK: Include the task 'assert_device_present.yml' 18911 1727096290.10669: in run() - task 0afff68d-5257-09a7-aae1-000000000012 18911 1727096290.10691: variable 'ansible_search_path' from source: unknown 18911 1727096290.10741: calling self._execute() 18911 1727096290.10863: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096290.10902: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096290.11017: variable 'omit' from source: magic vars 18911 1727096290.11959: variable 'ansible_distribution_major_version' from source: facts 18911 1727096290.11965: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096290.11974: _execute() done 18911 1727096290.11977: dumping result to json 18911 1727096290.11979: done dumping result, returning 18911 1727096290.11982: done running TaskExecutor() for managed_node1/TASK: Include the task 'assert_device_present.yml' [0afff68d-5257-09a7-aae1-000000000012] 18911 1727096290.11985: sending task result for task 0afff68d-5257-09a7-aae1-000000000012 18911 1727096290.12139: no more pending results, returning what we have 18911 1727096290.12145: in VariableManager get_vars() 18911 1727096290.12185: Calling all_inventory to load vars for managed_node1 18911 1727096290.12188: Calling groups_inventory to load vars for managed_node1 18911 1727096290.12192: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096290.12207: Calling all_plugins_play to load vars for managed_node1 18911 1727096290.12211: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096290.12214: Calling groups_plugins_play to load vars for managed_node1 18911 1727096290.12893: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096290.13223: done with get_vars() 18911 1727096290.13231: variable 'ansible_search_path' 
from source: unknown 18911 1727096290.13262: done sending task result for task 0afff68d-5257-09a7-aae1-000000000012 18911 1727096290.13266: WORKER PROCESS EXITING 18911 1727096290.13274: we have included files to process 18911 1727096290.13275: generating all_blocks data 18911 1727096290.13280: done generating all_blocks data 18911 1727096290.13284: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 18911 1727096290.13285: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 18911 1727096290.13315: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 18911 1727096290.13773: in VariableManager get_vars() 18911 1727096290.13790: done with get_vars() 18911 1727096290.13926: done processing included file 18911 1727096290.13928: iterating over new_blocks loaded from include file 18911 1727096290.13929: in VariableManager get_vars() 18911 1727096290.13940: done with get_vars() 18911 1727096290.13941: filtering new block on tags 18911 1727096290.13958: done filtering new block on tags 18911 1727096290.13960: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node1 18911 1727096290.13964: extending task lists for all hosts with included blocks 18911 1727096290.15407: done extending task lists 18911 1727096290.15409: done processing included files 18911 1727096290.15409: results queue empty 18911 1727096290.15410: checking for any_errors_fatal 18911 1727096290.15413: done checking for any_errors_fatal 18911 1727096290.15414: checking for max_fail_percentage 18911 1727096290.15415: done checking for max_fail_percentage 18911 1727096290.15416: checking to 
see if all hosts have failed and the running result is not ok 18911 1727096290.15417: done checking to see if all hosts have failed 18911 1727096290.15418: getting the remaining hosts for this loop 18911 1727096290.15419: done getting the remaining hosts for this loop 18911 1727096290.15421: getting the next task for host managed_node1 18911 1727096290.15426: done getting next task for host managed_node1 18911 1727096290.15428: ^ task is: TASK: Include the task 'get_interface_stat.yml' 18911 1727096290.15430: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096290.15433: getting variables 18911 1727096290.15434: in VariableManager get_vars() 18911 1727096290.15444: Calling all_inventory to load vars for managed_node1 18911 1727096290.15446: Calling groups_inventory to load vars for managed_node1 18911 1727096290.15449: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096290.15455: Calling all_plugins_play to load vars for managed_node1 18911 1727096290.15457: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096290.15460: Calling groups_plugins_play to load vars for managed_node1 18911 1727096290.15813: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096290.16234: done with get_vars() 18911 1727096290.16244: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Monday 23 September 2024 08:58:10 -0400 (0:00:00.067) 0:00:09.278 ****** 18911 1727096290.16436: entering _queue_task() for managed_node1/include_tasks 18911 1727096290.17146: worker is 1 (out of 1 available) 18911 1727096290.17157: exiting _queue_task() for managed_node1/include_tasks 18911 1727096290.17169: done queuing things up, now waiting for results queue to drain 18911 1727096290.17171: waiting for pending results... 
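The `entering _queue_task()` / `waiting for pending results...` lines above reflect the strategy plugin's worker model: the controller puts each task on a queue, a worker runs it, and the main process drains a results queue before moving on. A minimal sketch of that pattern, using a thread in place of Ansible's forked `WorkerProcess` (all names here are illustrative, not Ansible's):

```python
import queue
import threading

def worker(task_q: queue.Queue, result_q: queue.Queue) -> None:
    """Run queued tasks and push their results, in the spirit of a WorkerProcess."""
    while True:
        task = task_q.get()
        if task is None:  # sentinel: no more work, worker exits
            break
        name, func = task
        result_q.put((name, func()))

task_q: queue.Queue = queue.Queue()
result_q: queue.Queue = queue.Queue()
t = threading.Thread(target=worker, args=(task_q, result_q))
t.start()

# queue a task, then wait for the results queue to drain
task_q.put(("Include the task 'get_interface_stat.yml'", lambda: "ok"))
task_q.put(None)
t.join()
name, rc = result_q.get()
```

The log's "worker is 1 (out of 1 available)" corresponds to picking a free worker from a small pool rather than spawning one thread per task as this sketch does.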
18911 1727096290.17742: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 18911 1727096290.18047: in run() - task 0afff68d-5257-09a7-aae1-0000000001d3 18911 1727096290.18060: variable 'ansible_search_path' from source: unknown 18911 1727096290.18063: variable 'ansible_search_path' from source: unknown 18911 1727096290.18135: calling self._execute() 18911 1727096290.18560: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096290.18570: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096290.18574: variable 'omit' from source: magic vars 18911 1727096290.19665: variable 'ansible_distribution_major_version' from source: facts 18911 1727096290.19766: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096290.19922: _execute() done 18911 1727096290.19926: dumping result to json 18911 1727096290.19928: done dumping result, returning 18911 1727096290.19931: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [0afff68d-5257-09a7-aae1-0000000001d3] 18911 1727096290.19932: sending task result for task 0afff68d-5257-09a7-aae1-0000000001d3 18911 1727096290.20212: done sending task result for task 0afff68d-5257-09a7-aae1-0000000001d3 18911 1727096290.20215: WORKER PROCESS EXITING 18911 1727096290.20246: no more pending results, returning what we have 18911 1727096290.20251: in VariableManager get_vars() 18911 1727096290.20297: Calling all_inventory to load vars for managed_node1 18911 1727096290.20300: Calling groups_inventory to load vars for managed_node1 18911 1727096290.20305: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096290.20318: Calling all_plugins_play to load vars for managed_node1 18911 1727096290.20320: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096290.20324: Calling groups_plugins_play to load vars for managed_node1 18911 
1727096290.20724: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096290.21120: done with get_vars() 18911 1727096290.21128: variable 'ansible_search_path' from source: unknown 18911 1727096290.21129: variable 'ansible_search_path' from source: unknown 18911 1727096290.21160: we have included files to process 18911 1727096290.21162: generating all_blocks data 18911 1727096290.21163: done generating all_blocks data 18911 1727096290.21165: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 18911 1727096290.21165: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 18911 1727096290.21270: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 18911 1727096290.21854: done processing included file 18911 1727096290.21857: iterating over new_blocks loaded from include file 18911 1727096290.21859: in VariableManager get_vars() 18911 1727096290.21892: done with get_vars() 18911 1727096290.21895: filtering new block on tags 18911 1727096290.21910: done filtering new block on tags 18911 1727096290.21913: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node1 18911 1727096290.21918: extending task lists for all hosts with included blocks 18911 1727096290.22037: done extending task lists 18911 1727096290.22039: done processing included files 18911 1727096290.22039: results queue empty 18911 1727096290.22040: checking for any_errors_fatal 18911 1727096290.22043: done checking for any_errors_fatal 18911 1727096290.22043: checking for max_fail_percentage 18911 1727096290.22045: done checking for 
max_fail_percentage 18911 1727096290.22045: checking to see if all hosts have failed and the running result is not ok 18911 1727096290.22046: done checking to see if all hosts have failed 18911 1727096290.22047: getting the remaining hosts for this loop 18911 1727096290.22048: done getting the remaining hosts for this loop 18911 1727096290.22051: getting the next task for host managed_node1 18911 1727096290.22289: done getting next task for host managed_node1 18911 1727096290.22292: ^ task is: TASK: Get stat for interface {{ interface }} 18911 1727096290.22296: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096290.22299: getting variables 18911 1727096290.22300: in VariableManager get_vars() 18911 1727096290.22310: Calling all_inventory to load vars for managed_node1 18911 1727096290.22312: Calling groups_inventory to load vars for managed_node1 18911 1727096290.22315: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096290.22321: Calling all_plugins_play to load vars for managed_node1 18911 1727096290.22323: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096290.22326: Calling groups_plugins_play to load vars for managed_node1 18911 1727096290.22744: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096290.23039: done with get_vars() 18911 1727096290.23162: done getting variables 18911 1727096290.23425: variable 'interface' from source: set_fact TASK [Get stat for interface lsr27] ******************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Monday 23 September 2024 08:58:10 -0400 (0:00:00.070) 0:00:09.348 ****** 18911 1727096290.23457: entering _queue_task() for managed_node1/stat 18911 1727096290.24194: worker is 1 (out of 1 available) 18911 1727096290.24208: exiting _queue_task() for managed_node1/stat 18911 1727096290.24223: done queuing things up, now waiting for results queue to drain 18911 1727096290.24225: waiting for pending results... 
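The stat task queued above targets `/sys/class/net/lsr27`: on Linux that path exists (as a symlink) exactly when the kernel knows the interface, which is what the role's presence check exploits. A short Python sketch of the same test (`interface_present` and the `sysfs` parameter are illustrative names, not part of the role; the temp directory stands in for sysfs so the sketch is self-contained):

```python
import os
import tempfile

def interface_present(name: str, sysfs: str = "/sys/class/net") -> bool:
    """True if a network device entry for <name> exists under the sysfs tree."""
    # lexists: count the entry even though sysfs exposes it as a symlink
    return os.path.lexists(os.path.join(sysfs, name))

# emulate sysfs in a scratch directory instead of touching the real /sys
tmp = tempfile.mkdtemp()
open(os.path.join(tmp, "lsr27"), "w").close()  # stand-in for the device link
```

Against the real tree, `interface_present("lsr27")` mirrors what the `stat` module reports back as `stat.exists` later in this log.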
18911 1727096290.24729: running TaskExecutor() for managed_node1/TASK: Get stat for interface lsr27 18911 1727096290.24812: in run() - task 0afff68d-5257-09a7-aae1-00000000021e 18911 1727096290.25273: variable 'ansible_search_path' from source: unknown 18911 1727096290.25277: variable 'ansible_search_path' from source: unknown 18911 1727096290.25280: calling self._execute() 18911 1727096290.25283: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096290.25285: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096290.25287: variable 'omit' from source: magic vars 18911 1727096290.25903: variable 'ansible_distribution_major_version' from source: facts 18911 1727096290.25980: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096290.25994: variable 'omit' from source: magic vars 18911 1727096290.26043: variable 'omit' from source: magic vars 18911 1727096290.26391: variable 'interface' from source: set_fact 18911 1727096290.26402: variable 'omit' from source: magic vars 18911 1727096290.26447: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18911 1727096290.26491: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18911 1727096290.26673: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18911 1727096290.26676: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096290.26679: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096290.26708: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18911 1727096290.26827: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096290.26830: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096290.27176: Set connection var ansible_shell_executable to /bin/sh 18911 1727096290.27179: Set connection var ansible_timeout to 10 18911 1727096290.27182: Set connection var ansible_shell_type to sh 18911 1727096290.27184: Set connection var ansible_module_compression to ZIP_DEFLATED 18911 1727096290.27186: Set connection var ansible_pipelining to False 18911 1727096290.27188: Set connection var ansible_connection to ssh 18911 1727096290.27190: variable 'ansible_shell_executable' from source: unknown 18911 1727096290.27193: variable 'ansible_connection' from source: unknown 18911 1727096290.27195: variable 'ansible_module_compression' from source: unknown 18911 1727096290.27197: variable 'ansible_shell_type' from source: unknown 18911 1727096290.27199: variable 'ansible_shell_executable' from source: unknown 18911 1727096290.27201: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096290.27204: variable 'ansible_pipelining' from source: unknown 18911 1727096290.27206: variable 'ansible_timeout' from source: unknown 18911 1727096290.27208: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096290.27542: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18911 1727096290.27559: variable 'omit' from source: magic vars 18911 1727096290.27576: starting attempt loop 18911 1727096290.27614: running the handler 18911 1727096290.27635: _low_level_execute_command(): starting 18911 1727096290.27682: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18911 1727096290.28998: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096290.29017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096290.29038: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096290.29245: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096290.29253: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096290.29474: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096290.31202: stdout chunk (state=3): >>>/root <<< 18911 1727096290.31331: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096290.31529: stderr chunk (state=3): >>><<< 18911 1727096290.31532: stdout chunk (state=3): >>><<< 18911 1727096290.31537: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096290.31539: _low_level_execute_command(): starting 18911 1727096290.31542: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096290.314552-19391-87019713561709 `" && echo ansible-tmp-1727096290.314552-19391-87019713561709="` echo /root/.ansible/tmp/ansible-tmp-1727096290.314552-19391-87019713561709 `" ) && sleep 0' 18911 1727096290.32533: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096290.32547: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096290.32564: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096290.32617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096290.32692: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096290.32707: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096290.32729: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096290.32834: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096290.34824: stdout chunk (state=3): >>>ansible-tmp-1727096290.314552-19391-87019713561709=/root/.ansible/tmp/ansible-tmp-1727096290.314552-19391-87019713561709 <<< 18911 1727096290.35175: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096290.35179: stdout chunk (state=3): >>><<< 18911 1727096290.35181: stderr chunk (state=3): >>><<< 18911 1727096290.35183: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096290.314552-19391-87019713561709=/root/.ansible/tmp/ansible-tmp-1727096290.314552-19391-87019713561709 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096290.35186: variable 'ansible_module_compression' from source: unknown 18911 1727096290.35188: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18911d7od04qi/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 18911 1727096290.35405: variable 'ansible_facts' from source: unknown 18911 1727096290.35535: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096290.314552-19391-87019713561709/AnsiballZ_stat.py 18911 1727096290.35758: Sending initial data 18911 1727096290.35761: Sent initial data (151 bytes) 18911 1727096290.36572: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096290.36577: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096290.36612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found <<< 18911 1727096290.36619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096290.36675: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096290.38309: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18911 1727096290.38372: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
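Before the SFTP upload shown above, the controller created the per-task remote directory with the `( umask 77 && mkdir -p ... ) && sleep 0` command in the log; the directory name appears to combine a timestamp, the controller PID, and a random suffix. A simplified sketch of building such a command (this drops the backquoted `echo` wrappers the real command uses for tilde expansion, and the naming scheme is inferred from the log, not quoted from Ansible's source):

```python
import os
import random
import time

def remote_tmp_name() -> str:
    """Build a name shaped like the log's ansible-tmp-<time>-<pid>-<random>."""
    return "ansible-tmp-%s-%s-%s" % (time.time(), os.getpid(), random.randint(0, 2**48))

def mkdir_command(basedir: str, name: str) -> str:
    """Shell line in the shape of the log's low-level mkdir command (simplified)."""
    path = "%s/%s" % (basedir, name)
    return '( umask 77 && mkdir -p "%s" && mkdir "%s" && echo %s="%s" ) && sleep 0' % (
        basedir, path, name, path)

cmd = mkdir_command("/root/.ansible/tmp", remote_tmp_name())
```

The trailing `echo name=path` is what lets the controller read the resolved directory back from stdout, as seen in the `ansible-tmp-...=...` stdout chunk above.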
<<< 18911 1727096290.38476: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18911d7od04qi/tmpdg6d7x4l /root/.ansible/tmp/ansible-tmp-1727096290.314552-19391-87019713561709/AnsiballZ_stat.py <<< 18911 1727096290.38482: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096290.314552-19391-87019713561709/AnsiballZ_stat.py" <<< 18911 1727096290.38545: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18911d7od04qi/tmpdg6d7x4l" to remote "/root/.ansible/tmp/ansible-tmp-1727096290.314552-19391-87019713561709/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096290.314552-19391-87019713561709/AnsiballZ_stat.py" <<< 18911 1727096290.39474: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096290.39573: stderr chunk (state=3): >>><<< 18911 1727096290.39577: stdout chunk (state=3): >>><<< 18911 1727096290.39579: done transferring module to remote 18911 1727096290.39581: _low_level_execute_command(): starting 18911 1727096290.39583: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096290.314552-19391-87019713561709/ /root/.ansible/tmp/ansible-tmp-1727096290.314552-19391-87019713561709/AnsiballZ_stat.py && sleep 0' 18911 1727096290.39965: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096290.39989: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096290.39992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096290.40043: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096290.40114: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096290.41965: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096290.41975: stdout chunk (state=3): >>><<< 18911 1727096290.41978: stderr chunk (state=3): >>><<< 18911 1727096290.41982: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096290.41989: _low_level_execute_command(): starting 18911 1727096290.41993: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096290.314552-19391-87019713561709/AnsiballZ_stat.py && sleep 0' 18911 1727096290.42422: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096290.42426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096290.42428: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration <<< 18911 1727096290.42430: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096290.42434: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096290.42472: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096290.42496: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096290.42499: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096290.42573: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096290.58129: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/lsr27", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 28165, "dev": 23, "nlink": 1, "atime": 1727096288.512886, "mtime": 1727096288.512886, "ctime": 1727096288.512886, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/lsr27", "lnk_target": "../../devices/virtual/net/lsr27", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr27", "follow": false, "checksum_algorithm": "sha1"}}} <<< 18911 1727096290.59496: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
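The JSON reply above is what the controller parses to conclude the device is present: `stat.exists` is true, and the entry is a symlink into `/sys/devices/virtual/net/lsr27`, marking it as a virtual interface. A short sketch of reading those fields from such a reply (the JSON literal below is abbreviated from the log; `device_present` and `is_virtual` are illustrative names):

```python
import json

reply = json.loads("""
{"changed": false,
 "stat": {"exists": true, "path": "/sys/class/net/lsr27", "islnk": true,
          "lnk_source": "/sys/devices/virtual/net/lsr27"}}
""")

stat = reply["stat"]
device_present = stat["exists"]  # the condition assert_device_present cares about
is_virtual = stat["islnk"] and "/devices/virtual/" in stat["lnk_source"]
```

A physical NIC would instead resolve under `/sys/devices/pci.../net/...`, so the `lnk_source` field distinguishes veth/dummy test devices like `lsr27` from hardware.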
<<< 18911 1727096290.59515: stderr chunk (state=3): >>><<< 18911 1727096290.59518: stdout chunk (state=3): >>><<< 18911 1727096290.59533: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/lsr27", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 28165, "dev": 23, "nlink": 1, "atime": 1727096288.512886, "mtime": 1727096288.512886, "ctime": 1727096288.512886, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/lsr27", "lnk_target": "../../devices/virtual/net/lsr27", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr27", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 18911 1727096290.59575: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/lsr27', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096290.314552-19391-87019713561709/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18911 1727096290.59585: _low_level_execute_command(): starting 18911 1727096290.59590: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096290.314552-19391-87019713561709/ > /dev/null 2>&1 && sleep 0' 18911 1727096290.60048: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096290.60051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096290.60054: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096290.60056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found <<< 18911 1727096290.60058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096290.60108: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096290.60115: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096290.60117: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096290.60186: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096290.62382: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096290.62398: stdout chunk (state=3): >>><<< 18911 1727096290.62401: stderr chunk (state=3): >>><<< 18911 1727096290.62408: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096290.62410: handler run complete 18911 1727096290.62412: attempt loop complete, returning result 18911 1727096290.62414: _execute() done 18911 1727096290.62417: dumping result to json 18911 1727096290.62419: done dumping result, returning 18911 1727096290.62420: done running TaskExecutor() for managed_node1/TASK: Get stat for interface lsr27 [0afff68d-5257-09a7-aae1-00000000021e] 18911 1727096290.62432: sending task result for task 0afff68d-5257-09a7-aae1-00000000021e ok: [managed_node1] => { "changed": false, "stat": { "atime": 1727096288.512886, "block_size": 4096, "blocks": 0, "ctime": 1727096288.512886, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 28165, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/lsr27", "lnk_target": "../../devices/virtual/net/lsr27", "mode": "0777", "mtime": 1727096288.512886, "nlink": 1, "path": "/sys/class/net/lsr27", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 18911 1727096290.62963: no more pending results, returning what we have 18911 1727096290.63118: results queue empty 18911 1727096290.63120: checking for any_errors_fatal 18911 1727096290.63122: done checking for 
any_errors_fatal 18911 1727096290.63122: checking for max_fail_percentage 18911 1727096290.63124: done checking for max_fail_percentage 18911 1727096290.63125: checking to see if all hosts have failed and the running result is not ok 18911 1727096290.63126: done checking to see if all hosts have failed 18911 1727096290.63127: getting the remaining hosts for this loop 18911 1727096290.63128: done getting the remaining hosts for this loop 18911 1727096290.63131: getting the next task for host managed_node1 18911 1727096290.63140: done getting next task for host managed_node1 18911 1727096290.63143: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 18911 1727096290.63147: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096290.63150: getting variables 18911 1727096290.63152: in VariableManager get_vars() 18911 1727096290.63188: Calling all_inventory to load vars for managed_node1 18911 1727096290.63191: Calling groups_inventory to load vars for managed_node1 18911 1727096290.63194: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096290.63208: Calling all_plugins_play to load vars for managed_node1 18911 1727096290.63211: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096290.63214: Calling groups_plugins_play to load vars for managed_node1 18911 1727096290.63441: done sending task result for task 0afff68d-5257-09a7-aae1-00000000021e 18911 1727096290.63444: WORKER PROCESS EXITING 18911 1727096290.63865: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096290.64118: done with get_vars() 18911 1727096290.64157: done getting variables 18911 1727096290.64266: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 18911 1727096290.64395: variable 'interface' from source: set_fact TASK [Assert that the interface is present - 'lsr27'] ************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Monday 23 September 2024 08:58:10 -0400 (0:00:00.409) 0:00:09.758 ****** 18911 1727096290.64429: entering _queue_task() for managed_node1/assert 18911 1727096290.64430: Creating lock for assert 18911 1727096290.64946: worker is 1 (out of 1 available) 18911 1727096290.65303: exiting _queue_task() for managed_node1/assert 18911 1727096290.65314: done queuing things up, now waiting for results queue to drain 18911 
1727096290.65316: waiting for pending results... 18911 1727096290.65637: running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'lsr27' 18911 1727096290.65754: in run() - task 0afff68d-5257-09a7-aae1-0000000001d4 18911 1727096290.65787: variable 'ansible_search_path' from source: unknown 18911 1727096290.65893: variable 'ansible_search_path' from source: unknown 18911 1727096290.65900: calling self._execute() 18911 1727096290.66101: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096290.66115: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096290.66130: variable 'omit' from source: magic vars 18911 1727096290.67086: variable 'ansible_distribution_major_version' from source: facts 18911 1727096290.67104: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096290.67114: variable 'omit' from source: magic vars 18911 1727096290.67174: variable 'omit' from source: magic vars 18911 1727096290.67283: variable 'interface' from source: set_fact 18911 1727096290.67304: variable 'omit' from source: magic vars 18911 1727096290.67366: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18911 1727096290.67406: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18911 1727096290.67474: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18911 1727096290.67478: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096290.67480: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096290.67513: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18911 1727096290.67522: variable 'ansible_host' from source: host vars for 
'managed_node1' 18911 1727096290.67530: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096290.67643: Set connection var ansible_shell_executable to /bin/sh 18911 1727096290.67647: Set connection var ansible_timeout to 10 18911 1727096290.67649: Set connection var ansible_shell_type to sh 18911 1727096290.67656: Set connection var ansible_module_compression to ZIP_DEFLATED 18911 1727096290.67661: Set connection var ansible_pipelining to False 18911 1727096290.67673: Set connection var ansible_connection to ssh 18911 1727096290.67697: variable 'ansible_shell_executable' from source: unknown 18911 1727096290.67701: variable 'ansible_connection' from source: unknown 18911 1727096290.67703: variable 'ansible_module_compression' from source: unknown 18911 1727096290.67705: variable 'ansible_shell_type' from source: unknown 18911 1727096290.67707: variable 'ansible_shell_executable' from source: unknown 18911 1727096290.67710: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096290.67713: variable 'ansible_pipelining' from source: unknown 18911 1727096290.67716: variable 'ansible_timeout' from source: unknown 18911 1727096290.67720: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096290.67827: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18911 1727096290.67836: variable 'omit' from source: magic vars 18911 1727096290.67842: starting attempt loop 18911 1727096290.67845: running the handler 18911 1727096290.67953: variable 'interface_stat' from source: set_fact 18911 1727096290.67971: Evaluated conditional (interface_stat.stat.exists): True 18911 1727096290.67976: handler run complete 18911 1727096290.67987: 
attempt loop complete, returning result 18911 1727096290.67989: _execute() done 18911 1727096290.67993: dumping result to json 18911 1727096290.67996: done dumping result, returning 18911 1727096290.68003: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'lsr27' [0afff68d-5257-09a7-aae1-0000000001d4] 18911 1727096290.68006: sending task result for task 0afff68d-5257-09a7-aae1-0000000001d4 18911 1727096290.68086: done sending task result for task 0afff68d-5257-09a7-aae1-0000000001d4 18911 1727096290.68089: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 18911 1727096290.68160: no more pending results, returning what we have 18911 1727096290.68164: results queue empty 18911 1727096290.68165: checking for any_errors_fatal 18911 1727096290.68176: done checking for any_errors_fatal 18911 1727096290.68177: checking for max_fail_percentage 18911 1727096290.68179: done checking for max_fail_percentage 18911 1727096290.68180: checking to see if all hosts have failed and the running result is not ok 18911 1727096290.68180: done checking to see if all hosts have failed 18911 1727096290.68181: getting the remaining hosts for this loop 18911 1727096290.68182: done getting the remaining hosts for this loop 18911 1727096290.68185: getting the next task for host managed_node1 18911 1727096290.68193: done getting next task for host managed_node1 18911 1727096290.68195: ^ task is: TASK: meta (flush_handlers) 18911 1727096290.68198: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096290.68201: getting variables 18911 1727096290.68203: in VariableManager get_vars() 18911 1727096290.68232: Calling all_inventory to load vars for managed_node1 18911 1727096290.68235: Calling groups_inventory to load vars for managed_node1 18911 1727096290.68239: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096290.68249: Calling all_plugins_play to load vars for managed_node1 18911 1727096290.68251: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096290.68254: Calling groups_plugins_play to load vars for managed_node1 18911 1727096290.68395: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096290.68511: done with get_vars() 18911 1727096290.68519: done getting variables 18911 1727096290.68566: in VariableManager get_vars() 18911 1727096290.68575: Calling all_inventory to load vars for managed_node1 18911 1727096290.68577: Calling groups_inventory to load vars for managed_node1 18911 1727096290.68578: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096290.68582: Calling all_plugins_play to load vars for managed_node1 18911 1727096290.68583: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096290.68585: Calling groups_plugins_play to load vars for managed_node1 18911 1727096290.68693: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096290.68820: done with get_vars() 18911 1727096290.68858: done queuing things up, now waiting for results queue to drain 18911 1727096290.68860: results queue empty 18911 1727096290.68860: checking for any_errors_fatal 18911 1727096290.68871: done checking for any_errors_fatal 18911 1727096290.68872: checking for max_fail_percentage 18911 1727096290.68874: done checking for max_fail_percentage 18911 1727096290.68874: checking to see if all hosts have failed and the running result is not 
ok 18911 1727096290.68875: done checking to see if all hosts have failed 18911 1727096290.68881: getting the remaining hosts for this loop 18911 1727096290.68882: done getting the remaining hosts for this loop 18911 1727096290.68884: getting the next task for host managed_node1 18911 1727096290.68888: done getting next task for host managed_node1 18911 1727096290.68890: ^ task is: TASK: meta (flush_handlers) 18911 1727096290.68891: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18911 1727096290.68893: getting variables 18911 1727096290.68894: in VariableManager get_vars() 18911 1727096290.68901: Calling all_inventory to load vars for managed_node1 18911 1727096290.68903: Calling groups_inventory to load vars for managed_node1 18911 1727096290.68905: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096290.68910: Calling all_plugins_play to load vars for managed_node1 18911 1727096290.68912: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096290.68915: Calling groups_plugins_play to load vars for managed_node1 18911 1727096290.69046: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096290.69546: done with get_vars() 18911 1727096290.69553: done getting variables 18911 1727096290.69772: in VariableManager get_vars() 18911 1727096290.69780: Calling all_inventory to load vars for managed_node1 18911 1727096290.69782: Calling groups_inventory to load vars for managed_node1 18911 1727096290.69785: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096290.69789: Calling all_plugins_play to load vars for managed_node1 18911 1727096290.69791: Calling groups_plugins_inventory to load vars for 
managed_node1 18911 1727096290.69794: Calling groups_plugins_play to load vars for managed_node1 18911 1727096290.70044: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096290.70903: done with get_vars() 18911 1727096290.70917: done queuing things up, now waiting for results queue to drain 18911 1727096290.71030: results queue empty 18911 1727096290.71032: checking for any_errors_fatal 18911 1727096290.71034: done checking for any_errors_fatal 18911 1727096290.71034: checking for max_fail_percentage 18911 1727096290.71036: done checking for max_fail_percentage 18911 1727096290.71036: checking to see if all hosts have failed and the running result is not ok 18911 1727096290.71038: done checking to see if all hosts have failed 18911 1727096290.71038: getting the remaining hosts for this loop 18911 1727096290.71039: done getting the remaining hosts for this loop 18911 1727096290.71042: getting the next task for host managed_node1 18911 1727096290.71045: done getting next task for host managed_node1 18911 1727096290.71046: ^ task is: None 18911 1727096290.71047: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096290.71048: done queuing things up, now waiting for results queue to drain 18911 1727096290.71049: results queue empty 18911 1727096290.71050: checking for any_errors_fatal 18911 1727096290.71051: done checking for any_errors_fatal 18911 1727096290.71051: checking for max_fail_percentage 18911 1727096290.71052: done checking for max_fail_percentage 18911 1727096290.71053: checking to see if all hosts have failed and the running result is not ok 18911 1727096290.71054: done checking to see if all hosts have failed 18911 1727096290.71055: getting the next task for host managed_node1 18911 1727096290.71057: done getting next task for host managed_node1 18911 1727096290.71058: ^ task is: None 18911 1727096290.71059: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096290.71284: in VariableManager get_vars() 18911 1727096290.71312: done with get_vars() 18911 1727096290.71318: in VariableManager get_vars() 18911 1727096290.71329: done with get_vars() 18911 1727096290.71333: variable 'omit' from source: magic vars 18911 1727096290.71362: in VariableManager get_vars() 18911 1727096290.71377: done with get_vars() 18911 1727096290.71397: variable 'omit' from source: magic vars PLAY [Test static interface up] ************************************************ 18911 1727096290.73904: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18911 1727096290.74377: getting the remaining hosts for this loop 18911 1727096290.74379: done getting the remaining hosts for this loop 18911 1727096290.74381: getting the next task for host managed_node1 18911 1727096290.74384: done getting next task for host managed_node1 18911 1727096290.74386: ^ task is: TASK: Gathering Facts 18911 1727096290.74388: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096290.74390: getting variables 18911 1727096290.74391: in VariableManager get_vars() 18911 1727096290.74405: Calling all_inventory to load vars for managed_node1 18911 1727096290.74407: Calling groups_inventory to load vars for managed_node1 18911 1727096290.74409: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096290.74414: Calling all_plugins_play to load vars for managed_node1 18911 1727096290.74417: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096290.74419: Calling groups_plugins_play to load vars for managed_node1 18911 1727096290.74833: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096290.75693: done with get_vars() 18911 1727096290.75703: done getting variables 18911 1727096290.75745: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:33 Monday 23 September 2024 08:58:10 -0400 (0:00:00.113) 0:00:09.872 ****** 18911 1727096290.75774: entering _queue_task() for managed_node1/gather_facts 18911 1727096290.77041: worker is 1 (out of 1 available) 18911 1727096290.77054: exiting _queue_task() for managed_node1/gather_facts 18911 1727096290.77069: done queuing things up, now waiting for results queue to drain 18911 1727096290.77071: waiting for pending results... 
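
[Editor's note: the "Get stat for interface lsr27" and "Assert that the interface is present - 'lsr27'" results above come from `tasks/assert_device_present.yml` in the fedora.linux_system_roles test collection. The following is a hedged reconstruction of that task pattern, inferred only from the module arguments, the registered `interface_stat` variable, and the evaluated conditional visible in this log; it is not the verbatim contents of that file.]

```yaml
# Sketch of the stat + assert sequence seen in the log above.
# The stat module arguments mirror the logged invocation
# (get_attributes/get_checksum/get_mime all disabled); the
# registered variable name `interface_stat` and the condition
# `interface_stat.stat.exists` are taken from the log output.
- name: Get stat for interface {{ interface }}
  stat:
    path: "/sys/class/net/{{ interface }}"
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat

- name: "Assert that the interface is present - '{{ interface }}'"
  assert:
    that:
      - interface_stat.stat.exists
```

Because `/sys/class/net/<name>` is a symlink into `/sys/devices/`, a present interface yields `islnk: true` and `exists: true` in the stat result, which is exactly what the JSON payload earlier in this log shows for `lsr27`.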
18911 1727096290.77849: running TaskExecutor() for managed_node1/TASK: Gathering Facts 18911 1727096290.78133: in run() - task 0afff68d-5257-09a7-aae1-000000000237 18911 1727096290.78137: variable 'ansible_search_path' from source: unknown 18911 1727096290.78156: calling self._execute() 18911 1727096290.78759: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096290.78762: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096290.78766: variable 'omit' from source: magic vars 18911 1727096290.79684: variable 'ansible_distribution_major_version' from source: facts 18911 1727096290.79689: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096290.79692: variable 'omit' from source: magic vars 18911 1727096290.79695: variable 'omit' from source: magic vars 18911 1727096290.79951: variable 'omit' from source: magic vars 18911 1727096290.79955: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18911 1727096290.79958: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18911 1727096290.80082: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18911 1727096290.80104: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096290.80120: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096290.80155: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18911 1727096290.80178: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096290.80373: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096290.80422: Set connection var ansible_shell_executable to /bin/sh 18911 1727096290.80505: Set 
connection var ansible_timeout to 10 18911 1727096290.80515: Set connection var ansible_shell_type to sh 18911 1727096290.80529: Set connection var ansible_module_compression to ZIP_DEFLATED 18911 1727096290.80540: Set connection var ansible_pipelining to False 18911 1727096290.80553: Set connection var ansible_connection to ssh 18911 1727096290.80634: variable 'ansible_shell_executable' from source: unknown 18911 1727096290.80772: variable 'ansible_connection' from source: unknown 18911 1727096290.80775: variable 'ansible_module_compression' from source: unknown 18911 1727096290.80778: variable 'ansible_shell_type' from source: unknown 18911 1727096290.80780: variable 'ansible_shell_executable' from source: unknown 18911 1727096290.80781: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096290.80783: variable 'ansible_pipelining' from source: unknown 18911 1727096290.80785: variable 'ansible_timeout' from source: unknown 18911 1727096290.80787: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096290.81180: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18911 1727096290.81197: variable 'omit' from source: magic vars 18911 1727096290.81207: starting attempt loop 18911 1727096290.81215: running the handler 18911 1727096290.81236: variable 'ansible_facts' from source: unknown 18911 1727096290.81483: _low_level_execute_command(): starting 18911 1727096290.81488: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18911 1727096290.83272: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096290.83277: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096290.83678: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096290.85288: stdout chunk (state=3): >>>/root <<< 18911 1727096290.85412: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096290.85451: stderr chunk (state=3): >>><<< 18911 1727096290.85660: stdout chunk (state=3): >>><<< 18911 1727096290.85665: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096290.85670: _low_level_execute_command(): starting 18911 1727096290.85675: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096290.8558462-19422-182276346172357 `" && echo ansible-tmp-1727096290.8558462-19422-182276346172357="` echo /root/.ansible/tmp/ansible-tmp-1727096290.8558462-19422-182276346172357 `" ) && sleep 0' 18911 1727096290.87199: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096290.87215: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096290.87349: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096290.89507: stdout chunk (state=3): >>>ansible-tmp-1727096290.8558462-19422-182276346172357=/root/.ansible/tmp/ansible-tmp-1727096290.8558462-19422-182276346172357 <<< 18911 1727096290.89529: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096290.89587: stderr chunk (state=3): >>><<< 18911 1727096290.89598: stdout chunk (state=3): >>><<< 18911 1727096290.89623: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096290.8558462-19422-182276346172357=/root/.ansible/tmp/ansible-tmp-1727096290.8558462-19422-182276346172357 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096290.89978: variable 'ansible_module_compression' from source: unknown 18911 1727096290.89981: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18911d7od04qi/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 18911 1727096290.90092: variable 'ansible_facts' from source: unknown 18911 1727096290.90401: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096290.8558462-19422-182276346172357/AnsiballZ_setup.py 18911 1727096290.90822: Sending initial data 18911 1727096290.90880: Sent initial data (154 bytes) 18911 1727096290.92386: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096290.92681: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096290.92708: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096290.92812: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 18911 1727096290.94458: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 18911 1727096290.94478: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 18911 1727096290.94510: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18911 1727096290.94566: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18911 1727096290.94729: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18911d7od04qi/tmpobvtz3ay /root/.ansible/tmp/ansible-tmp-1727096290.8558462-19422-182276346172357/AnsiballZ_setup.py <<< 18911 1727096290.94738: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096290.8558462-19422-182276346172357/AnsiballZ_setup.py" <<< 18911 1727096290.94826: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18911d7od04qi/tmpobvtz3ay" to remote "/root/.ansible/tmp/ansible-tmp-1727096290.8558462-19422-182276346172357/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096290.8558462-19422-182276346172357/AnsiballZ_setup.py" <<< 18911 1727096290.97801: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096290.97856: stdout chunk (state=3): >>><<< 18911 1727096290.97871: stderr chunk (state=3): >>><<< 18911 1727096290.97937: done transferring module to remote 18911 1727096290.97952: _low_level_execute_command(): starting 18911 1727096290.97970: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096290.8558462-19422-182276346172357/ /root/.ansible/tmp/ansible-tmp-1727096290.8558462-19422-182276346172357/AnsiballZ_setup.py && sleep 0' 18911 1727096290.99257: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096290.99261: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096290.99531: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096290.99547: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096290.99645: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096291.01551: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096291.01595: stdout chunk (state=3): >>><<< 18911 1727096291.01598: stderr chunk (state=3): >>><<< 18911 1727096291.01616: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096291.01628: _low_level_execute_command(): starting 18911 1727096291.01640: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096290.8558462-19422-182276346172357/AnsiballZ_setup.py && sleep 0' 18911 1727096291.02488: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096291.02492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096291.02496: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address <<< 18911 1727096291.02566: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096291.02577: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096291.02617: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 18911 1727096291.68351: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-125", "ansible_nodename": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2082d1236a0674562ec5e13633e7ec", "ansible_is_chroot": false, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCSodd+GDkKMH+cjKateYxMnFqGIiE1cnsha6Cte8GTIWdeerR6yaWx/72lnuD/E1u9q+QiEGNjwKJ+EVlEbMpe1E41fS12cQMWtucXlJvMJH1Ykit9m7FF02YjLW9EkA9SSgc18cxhBxGHI+sOyXoiqbbp/q8CeDDI6WWQ+a3ZlRONcuaxwv9JBCJY6ZIqEDUsG55fAkwTft0LrM9VfN5D7iCcrlRTAVcjACJKgdvzHt52sVh5wuhCgzisIGrxldgqYAo0PnKgLwHzMWTYpwBJQIwMf6hC7O+PTnljDRXhsY8vTDdVI9W4Nr7mLWJ7W5JZ0lrKajT+FZ1b8usigp6firlnTtqzVmKJ4HJTlBOQ00HmCvVRSFb8Y15UB8Gz/xjJSXiQklJoKt+/wRRazuZALEowaIIdTzQUGv5Yx/BYhcAv8KVvhqbfkvyuwAgI+jzck3nhlUJ3B1WVYs5RdDY1WWEbqEtKyERi7P+cyWg6OhEMmJIpLs1QBW+CU/89w+s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE9Vvl7mU+dsvtGHlOyMt0Nx2BOUkSxor6Ul/pQgHJkP0t5ie5s9TW9uniTzz5NfayEXaMlnOkK/UBl0o7msG9k=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOsOWQBYI+BXY81/x/jv3URNQxi5p8vLDeQfEZVgnjS4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_loadavg": {"1m": 0.52978515625, "5m": 0.3623046875, "15m": 0.17822265625}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "58", "second": "11", "epoch": "1727096291", "epoch_int": "1727096291", "date": "2024-09-23", 
"time": "08:58:11", "iso8601_micro": "2024-09-23T12:58:11.305482Z", "iso8601": "2024-09-23T12:58:11Z", "iso8601_basic": "20240923T085811305482", "iso8601_basic_short": "20240923T085811", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 40710 10.31.11.125 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 40710 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2937, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 594, "free": 2937}, "nocache": {"free": 3274, "used": 257}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", 
"ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_uuid": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 444, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261795401728, "block_size": 4096, "block_total": 65519099, "block_available": 63914893, "block_used": 1604206, "inode_total": 131070960, "inode_available": 131029100, "inode_used": 41860, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_local": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_fips": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_dns": 
{"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_fibre_channel_wwn": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_interfaces": ["lo", "lsr27", "peerlsr27", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"addre<<< 18911 1727096291.68392: stdout chunk (state=3): >>>ss": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": 
"off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10ff:acff:fe3f:90f5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", 
"tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerlsr27": 
{"device": "peerlsr27", "macaddress": "ea:ad:4d:e4:a2:0e", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::e8ad:4dff:fee4:a20e", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", 
"tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lsr27": {"device": "lsr27", "macaddress": "da:be:ac:0e:e2:e3", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::d8be:acff:fe0e:e2e3", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", 
"loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.125"], "ansible_all_ipv6_addresses": ["fe80::10ff:acff:fe3f:90f5", "fe80::e8ad:4dff:fee4:a20e", "fe80::d8be:acff:fe0e:e2e3"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.125", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10ff:acff:fe3f:90f5", "fe80::d8be:acff:fe0e:e2e3", "fe80::e8ad:4dff:fee4:a20e"]}, "ansible_iscsi_iqn": "", "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 18911 1727096291.70458: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
<<< 18911 1727096291.70466: stdout chunk (state=3): >>><<< 18911 1727096291.70471: stderr chunk (state=3): >>><<< 18911 1727096291.70675: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-125", "ansible_nodename": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2082d1236a0674562ec5e13633e7ec", "ansible_is_chroot": false, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCSodd+GDkKMH+cjKateYxMnFqGIiE1cnsha6Cte8GTIWdeerR6yaWx/72lnuD/E1u9q+QiEGNjwKJ+EVlEbMpe1E41fS12cQMWtucXlJvMJH1Ykit9m7FF02YjLW9EkA9SSgc18cxhBxGHI+sOyXoiqbbp/q8CeDDI6WWQ+a3ZlRONcuaxwv9JBCJY6ZIqEDUsG55fAkwTft0LrM9VfN5D7iCcrlRTAVcjACJKgdvzHt52sVh5wuhCgzisIGrxldgqYAo0PnKgLwHzMWTYpwBJQIwMf6hC7O+PTnljDRXhsY8vTDdVI9W4Nr7mLWJ7W5JZ0lrKajT+FZ1b8usigp6firlnTtqzVmKJ4HJTlBOQ00HmCvVRSFb8Y15UB8Gz/xjJSXiQklJoKt+/wRRazuZALEowaIIdTzQUGv5Yx/BYhcAv8KVvhqbfkvyuwAgI+jzck3nhlUJ3B1WVYs5RdDY1WWEbqEtKyERi7P+cyWg6OhEMmJIpLs1QBW+CU/89w+s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE9Vvl7mU+dsvtGHlOyMt0Nx2BOUkSxor6Ul/pQgHJkP0t5ie5s9TW9uniTzz5NfayEXaMlnOkK/UBl0o7msG9k=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOsOWQBYI+BXY81/x/jv3URNQxi5p8vLDeQfEZVgnjS4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_loadavg": {"1m": 0.52978515625, "5m": 0.3623046875, "15m": 0.17822265625}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "58", "second": "11", "epoch": "1727096291", "epoch_int": "1727096291", "date": "2024-09-23", 
"time": "08:58:11", "iso8601_micro": "2024-09-23T12:58:11.305482Z", "iso8601": "2024-09-23T12:58:11Z", "iso8601_basic": "20240923T085811305482", "iso8601_basic_short": "20240923T085811", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 40710 10.31.11.125 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 40710 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2937, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 594, "free": 2937}, "nocache": {"free": 3274, "used": 257}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", 
"ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_uuid": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 444, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261795401728, "block_size": 4096, "block_total": 65519099, "block_available": 63914893, "block_used": 1604206, "inode_total": 131070960, "inode_available": 131029100, "inode_used": 41860, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_local": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_fips": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_dns": 
{"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_fibre_channel_wwn": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_interfaces": ["lo", "lsr27", "peerlsr27", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", 
"tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10ff:acff:fe3f:90f5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", 
"tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "ea:ad:4d:e4:a2:0e", "mtu": 1500, 
"active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::e8ad:4dff:fee4:a20e", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", 
"tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lsr27": {"device": "lsr27", "macaddress": "da:be:ac:0e:e2:e3", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::d8be:acff:fe0e:e2e3", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off 
[fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.125"], "ansible_all_ipv6_addresses": ["fe80::10ff:acff:fe3f:90f5", "fe80::e8ad:4dff:fee4:a20e", "fe80::d8be:acff:fe0e:e2e3"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.125", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10ff:acff:fe3f:90f5", "fe80::d8be:acff:fe0e:e2e3", "fe80::e8ad:4dff:fee4:a20e"]}, "ansible_iscsi_iqn": "", "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 18911 1727096291.71259: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096290.8558462-19422-182276346172357/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18911 1727096291.71292: _low_level_execute_command(): starting 18911 1727096291.71303: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096290.8558462-19422-182276346172357/ > /dev/null 2>&1 && sleep 0' 18911 1727096291.71947: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 
1727096291.71960: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096291.71983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096291.72002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096291.72019: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 18911 1727096291.72121: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096291.72142: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096291.72241: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096291.74129: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096291.74191: stderr chunk (state=3): >>><<< 18911 1727096291.74213: stdout chunk (state=3): >>><<< 18911 1727096291.74243: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096291.74257: handler run complete 18911 1727096291.74436: variable 'ansible_facts' from source: unknown 18911 1727096291.74564: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096291.75051: variable 'ansible_facts' from source: unknown 18911 1727096291.75101: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096291.75266: attempt loop complete, returning result 18911 1727096291.75287: _execute() done 18911 1727096291.75296: dumping result to json 18911 1727096291.75338: done dumping result, returning 18911 1727096291.75350: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0afff68d-5257-09a7-aae1-000000000237] 18911 1727096291.75358: sending task result for task 0afff68d-5257-09a7-aae1-000000000237 18911 1727096291.76494: done sending task result for task 0afff68d-5257-09a7-aae1-000000000237 18911 1727096291.76498: WORKER PROCESS EXITING ok: [managed_node1] 18911 1727096291.76890: no more 
pending results, returning what we have 18911 1727096291.76893: results queue empty 18911 1727096291.76894: checking for any_errors_fatal 18911 1727096291.76896: done checking for any_errors_fatal 18911 1727096291.76897: checking for max_fail_percentage 18911 1727096291.76898: done checking for max_fail_percentage 18911 1727096291.76899: checking to see if all hosts have failed and the running result is not ok 18911 1727096291.76900: done checking to see if all hosts have failed 18911 1727096291.76901: getting the remaining hosts for this loop 18911 1727096291.76902: done getting the remaining hosts for this loop 18911 1727096291.76906: getting the next task for host managed_node1 18911 1727096291.76911: done getting next task for host managed_node1 18911 1727096291.76913: ^ task is: TASK: meta (flush_handlers) 18911 1727096291.76914: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096291.76918: getting variables 18911 1727096291.76919: in VariableManager get_vars() 18911 1727096291.76945: Calling all_inventory to load vars for managed_node1 18911 1727096291.76955: Calling groups_inventory to load vars for managed_node1 18911 1727096291.76958: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096291.76973: Calling all_plugins_play to load vars for managed_node1 18911 1727096291.76977: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096291.76980: Calling groups_plugins_play to load vars for managed_node1 18911 1727096291.77158: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096291.77571: done with get_vars() 18911 1727096291.77582: done getting variables 18911 1727096291.77653: in VariableManager get_vars() 18911 1727096291.77669: Calling all_inventory to load vars for managed_node1 18911 1727096291.77672: Calling groups_inventory to load vars for managed_node1 18911 1727096291.77674: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096291.77678: Calling all_plugins_play to load vars for managed_node1 18911 1727096291.77681: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096291.77683: Calling groups_plugins_play to load vars for managed_node1 18911 1727096291.77879: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096291.78135: done with get_vars() 18911 1727096291.78148: done queuing things up, now waiting for results queue to drain 18911 1727096291.78171: results queue empty 18911 1727096291.78172: checking for any_errors_fatal 18911 1727096291.78178: done checking for any_errors_fatal 18911 1727096291.78179: checking for max_fail_percentage 18911 1727096291.78185: done checking for max_fail_percentage 18911 1727096291.78186: checking to see if all hosts have failed and the running result is not 
ok 18911 1727096291.78187: done checking to see if all hosts have failed 18911 1727096291.78188: getting the remaining hosts for this loop 18911 1727096291.78189: done getting the remaining hosts for this loop 18911 1727096291.78191: getting the next task for host managed_node1 18911 1727096291.78195: done getting next task for host managed_node1 18911 1727096291.78198: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 18911 1727096291.78199: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18911 1727096291.78209: getting variables 18911 1727096291.78210: in VariableManager get_vars() 18911 1727096291.78222: Calling all_inventory to load vars for managed_node1 18911 1727096291.78224: Calling groups_inventory to load vars for managed_node1 18911 1727096291.78226: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096291.78230: Calling all_plugins_play to load vars for managed_node1 18911 1727096291.78233: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096291.78235: Calling groups_plugins_play to load vars for managed_node1 18911 1727096291.78407: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096291.78712: done with get_vars() 18911 1727096291.78719: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Monday 23 September 2024 08:58:11 -0400 (0:00:01.030) 0:00:10.902 ****** 18911 1727096291.78827: entering _queue_task() for managed_node1/include_tasks 18911 1727096291.79209: worker is 1 (out of 1 available) 18911 
1727096291.79221: exiting _queue_task() for managed_node1/include_tasks 18911 1727096291.79232: done queuing things up, now waiting for results queue to drain 18911 1727096291.79234: waiting for pending results... 18911 1727096291.79606: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 18911 1727096291.79646: in run() - task 0afff68d-5257-09a7-aae1-000000000019 18911 1727096291.79672: variable 'ansible_search_path' from source: unknown 18911 1727096291.79681: variable 'ansible_search_path' from source: unknown 18911 1727096291.79728: calling self._execute() 18911 1727096291.79820: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096291.79873: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096291.79877: variable 'omit' from source: magic vars 18911 1727096291.80249: variable 'ansible_distribution_major_version' from source: facts 18911 1727096291.80276: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096291.80286: _execute() done 18911 1727096291.80294: dumping result to json 18911 1727096291.80302: done dumping result, returning 18911 1727096291.80313: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0afff68d-5257-09a7-aae1-000000000019] 18911 1727096291.80355: sending task result for task 0afff68d-5257-09a7-aae1-000000000019 18911 1727096291.80586: no more pending results, returning what we have 18911 1727096291.80591: in VariableManager get_vars() 18911 1727096291.80636: Calling all_inventory to load vars for managed_node1 18911 1727096291.80640: Calling groups_inventory to load vars for managed_node1 18911 1727096291.80643: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096291.80656: Calling all_plugins_play to load vars for managed_node1 18911 1727096291.80660: Calling groups_plugins_inventory to 
load vars for managed_node1 18911 1727096291.80666: Calling groups_plugins_play to load vars for managed_node1 18911 1727096291.80682: done sending task result for task 0afff68d-5257-09a7-aae1-000000000019 18911 1727096291.80685: WORKER PROCESS EXITING 18911 1727096291.81180: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096291.81417: done with get_vars() 18911 1727096291.81426: variable 'ansible_search_path' from source: unknown 18911 1727096291.81427: variable 'ansible_search_path' from source: unknown 18911 1727096291.81457: we have included files to process 18911 1727096291.81458: generating all_blocks data 18911 1727096291.81459: done generating all_blocks data 18911 1727096291.81460: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 18911 1727096291.81463: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 18911 1727096291.81465: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 18911 1727096291.82246: done processing included file 18911 1727096291.82248: iterating over new_blocks loaded from include file 18911 1727096291.82250: in VariableManager get_vars() 18911 1727096291.82273: done with get_vars() 18911 1727096291.82275: filtering new block on tags 18911 1727096291.82293: done filtering new block on tags 18911 1727096291.82297: in VariableManager get_vars() 18911 1727096291.82483: done with get_vars() 18911 1727096291.82485: filtering new block on tags 18911 1727096291.82504: done filtering new block on tags 18911 1727096291.82507: in VariableManager get_vars() 18911 1727096291.82526: done with get_vars() 18911 1727096291.82528: filtering new block on tags 18911 1727096291.82544: done filtering new block on tags 18911 1727096291.82546: done iterating over new_blocks 
loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 18911 1727096291.82551: extending task lists for all hosts with included blocks 18911 1727096291.83041: done extending task lists 18911 1727096291.83042: done processing included files 18911 1727096291.83043: results queue empty 18911 1727096291.83044: checking for any_errors_fatal 18911 1727096291.83045: done checking for any_errors_fatal 18911 1727096291.83046: checking for max_fail_percentage 18911 1727096291.83047: done checking for max_fail_percentage 18911 1727096291.83048: checking to see if all hosts have failed and the running result is not ok 18911 1727096291.83049: done checking to see if all hosts have failed 18911 1727096291.83049: getting the remaining hosts for this loop 18911 1727096291.83050: done getting the remaining hosts for this loop 18911 1727096291.83053: getting the next task for host managed_node1 18911 1727096291.83056: done getting next task for host managed_node1 18911 1727096291.83059: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 18911 1727096291.83061: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096291.83072: getting variables 18911 1727096291.83073: in VariableManager get_vars() 18911 1727096291.83088: Calling all_inventory to load vars for managed_node1 18911 1727096291.83090: Calling groups_inventory to load vars for managed_node1 18911 1727096291.83092: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096291.83097: Calling all_plugins_play to load vars for managed_node1 18911 1727096291.83100: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096291.83108: Calling groups_plugins_play to load vars for managed_node1 18911 1727096291.83308: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096291.83542: done with get_vars() 18911 1727096291.83552: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Monday 23 September 2024 08:58:11 -0400 (0:00:00.047) 0:00:10.950 ****** 18911 1727096291.83626: entering _queue_task() for managed_node1/setup 18911 1727096291.84329: worker is 1 (out of 1 available) 18911 1727096291.84342: exiting _queue_task() for managed_node1/setup 18911 1727096291.84353: done queuing things up, now waiting for results queue to drain 18911 1727096291.84354: waiting for pending results... 
18911 1727096291.84694: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 18911 1727096291.84821: in run() - task 0afff68d-5257-09a7-aae1-000000000279 18911 1727096291.84844: variable 'ansible_search_path' from source: unknown 18911 1727096291.84852: variable 'ansible_search_path' from source: unknown 18911 1727096291.84898: calling self._execute() 18911 1727096291.84994: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096291.85003: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096291.85101: variable 'omit' from source: magic vars 18911 1727096291.85628: variable 'ansible_distribution_major_version' from source: facts 18911 1727096291.85646: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096291.85903: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18911 1727096291.88376: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18911 1727096291.88459: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18911 1727096291.88673: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18911 1727096291.88677: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18911 1727096291.88694: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18911 1727096291.88848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096291.88918: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096291.88948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096291.89000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096291.89019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096291.89086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096291.89117: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096291.89190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096291.89211: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096291.89230: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096291.89420: variable '__network_required_facts' from source: role 
'' defaults 18911 1727096291.89438: variable 'ansible_facts' from source: unknown 18911 1727096291.89625: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 18911 1727096291.89629: when evaluation is False, skipping this task 18911 1727096291.89631: _execute() done 18911 1727096291.89633: dumping result to json 18911 1727096291.89635: done dumping result, returning 18911 1727096291.89638: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0afff68d-5257-09a7-aae1-000000000279] 18911 1727096291.89640: sending task result for task 0afff68d-5257-09a7-aae1-000000000279 18911 1727096291.89715: done sending task result for task 0afff68d-5257-09a7-aae1-000000000279 18911 1727096291.89718: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18911 1727096291.89790: no more pending results, returning what we have 18911 1727096291.89794: results queue empty 18911 1727096291.89796: checking for any_errors_fatal 18911 1727096291.89797: done checking for any_errors_fatal 18911 1727096291.89798: checking for max_fail_percentage 18911 1727096291.89800: done checking for max_fail_percentage 18911 1727096291.89801: checking to see if all hosts have failed and the running result is not ok 18911 1727096291.89802: done checking to see if all hosts have failed 18911 1727096291.89803: getting the remaining hosts for this loop 18911 1727096291.89804: done getting the remaining hosts for this loop 18911 1727096291.89807: getting the next task for host managed_node1 18911 1727096291.89817: done getting next task for host managed_node1 18911 1727096291.89822: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 18911 1727096291.89825: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18911 1727096291.89844: getting variables 18911 1727096291.89847: in VariableManager get_vars() 18911 1727096291.89893: Calling all_inventory to load vars for managed_node1 18911 1727096291.89897: Calling groups_inventory to load vars for managed_node1 18911 1727096291.89899: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096291.89910: Calling all_plugins_play to load vars for managed_node1 18911 1727096291.89913: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096291.89915: Calling groups_plugins_play to load vars for managed_node1 18911 1727096291.90431: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096291.90689: done with get_vars() 18911 1727096291.90701: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Monday 23 September 2024 08:58:11 -0400 (0:00:00.071) 0:00:11.022 ****** 18911 1727096291.90798: entering _queue_task() for managed_node1/stat 18911 1727096291.91063: worker is 1 (out of 1 available) 18911 1727096291.91277: exiting _queue_task() for managed_node1/stat 18911 1727096291.91287: done queuing things up, now waiting for results queue to drain 18911 1727096291.91288: waiting for pending results... 
18911 1727096291.91795: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 18911 1727096291.91799: in run() - task 0afff68d-5257-09a7-aae1-00000000027b 18911 1727096291.91802: variable 'ansible_search_path' from source: unknown 18911 1727096291.91804: variable 'ansible_search_path' from source: unknown 18911 1727096291.91921: calling self._execute() 18911 1727096291.92123: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096291.92139: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096291.92151: variable 'omit' from source: magic vars 18911 1727096291.92848: variable 'ansible_distribution_major_version' from source: facts 18911 1727096291.93073: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096291.93256: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18911 1727096291.93969: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18911 1727096291.94024: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18911 1727096291.94064: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18911 1727096291.94105: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18911 1727096291.94309: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18911 1727096291.94335: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18911 1727096291.94360: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096291.94474: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18911 1727096291.94571: variable '__network_is_ostree' from source: set_fact 18911 1727096291.94627: Evaluated conditional (not __network_is_ostree is defined): False 18911 1727096291.94833: when evaluation is False, skipping this task 18911 1727096291.94837: _execute() done 18911 1727096291.94840: dumping result to json 18911 1727096291.94842: done dumping result, returning 18911 1727096291.94845: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [0afff68d-5257-09a7-aae1-00000000027b] 18911 1727096291.94847: sending task result for task 0afff68d-5257-09a7-aae1-00000000027b 18911 1727096291.94919: done sending task result for task 0afff68d-5257-09a7-aae1-00000000027b 18911 1727096291.94922: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 18911 1727096291.94995: no more pending results, returning what we have 18911 1727096291.94999: results queue empty 18911 1727096291.95001: checking for any_errors_fatal 18911 1727096291.95010: done checking for any_errors_fatal 18911 1727096291.95011: checking for max_fail_percentage 18911 1727096291.95012: done checking for max_fail_percentage 18911 1727096291.95013: checking to see if all hosts have failed and the running result is not ok 18911 1727096291.95014: done checking to see if all hosts have failed 18911 1727096291.95015: getting the remaining hosts for this loop 18911 1727096291.95016: done getting the remaining hosts for this loop 18911 
1727096291.95019: getting the next task for host managed_node1 18911 1727096291.95027: done getting next task for host managed_node1 18911 1727096291.95031: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 18911 1727096291.95033: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18911 1727096291.95051: getting variables 18911 1727096291.95053: in VariableManager get_vars() 18911 1727096291.95101: Calling all_inventory to load vars for managed_node1 18911 1727096291.95104: Calling groups_inventory to load vars for managed_node1 18911 1727096291.95107: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096291.95118: Calling all_plugins_play to load vars for managed_node1 18911 1727096291.95121: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096291.95124: Calling groups_plugins_play to load vars for managed_node1 18911 1727096291.95972: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096291.96419: done with get_vars() 18911 1727096291.96433: done getting variables 18911 1727096291.96503: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Monday 23 September 2024 08:58:11 -0400 (0:00:00.057) 0:00:11.079 ****** 18911 1727096291.96537: entering _queue_task() for managed_node1/set_fact 18911 1727096291.97346: worker is 1 (out of 1 available) 18911 1727096291.97360: exiting _queue_task() for managed_node1/set_fact 18911 1727096291.97379: done queuing things up, now waiting for results queue to drain 18911 1727096291.97380: waiting for pending results... 18911 1727096291.97890: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 18911 1727096291.97998: in run() - task 0afff68d-5257-09a7-aae1-00000000027c 18911 1727096291.98131: variable 'ansible_search_path' from source: unknown 18911 1727096291.98134: variable 'ansible_search_path' from source: unknown 18911 1727096291.98170: calling self._execute() 18911 1727096291.98304: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096291.98307: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096291.98320: variable 'omit' from source: magic vars 18911 1727096291.99325: variable 'ansible_distribution_major_version' from source: facts 18911 1727096291.99448: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096291.99714: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18911 1727096292.00487: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18911 1727096292.00536: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18911 1727096292.00568: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18911 
1727096292.00713: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18911 1727096292.00904: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18911 1727096292.00934: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18911 1727096292.00960: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096292.00987: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18911 1727096292.01255: variable '__network_is_ostree' from source: set_fact 18911 1727096292.01265: Evaluated conditional (not __network_is_ostree is defined): False 18911 1727096292.01271: when evaluation is False, skipping this task 18911 1727096292.01274: _execute() done 18911 1727096292.01276: dumping result to json 18911 1727096292.01278: done dumping result, returning 18911 1727096292.01298: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0afff68d-5257-09a7-aae1-00000000027c] 18911 1727096292.01301: sending task result for task 0afff68d-5257-09a7-aae1-00000000027c skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 18911 1727096292.01536: no more pending results, returning what we have 18911 1727096292.01539: results queue empty 18911 1727096292.01540: checking for any_errors_fatal 18911 1727096292.01547: done checking 
for any_errors_fatal 18911 1727096292.01547: checking for max_fail_percentage 18911 1727096292.01549: done checking for max_fail_percentage 18911 1727096292.01550: checking to see if all hosts have failed and the running result is not ok 18911 1727096292.01550: done checking to see if all hosts have failed 18911 1727096292.01551: getting the remaining hosts for this loop 18911 1727096292.01552: done getting the remaining hosts for this loop 18911 1727096292.01555: getting the next task for host managed_node1 18911 1727096292.01569: done getting next task for host managed_node1 18911 1727096292.01573: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 18911 1727096292.01576: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096292.01591: getting variables 18911 1727096292.01593: in VariableManager get_vars() 18911 1727096292.01633: Calling all_inventory to load vars for managed_node1 18911 1727096292.01636: Calling groups_inventory to load vars for managed_node1 18911 1727096292.01638: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096292.01649: Calling all_plugins_play to load vars for managed_node1 18911 1727096292.01653: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096292.01656: Calling groups_plugins_play to load vars for managed_node1 18911 1727096292.02250: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096292.02783: done sending task result for task 0afff68d-5257-09a7-aae1-00000000027c 18911 1727096292.02786: WORKER PROCESS EXITING 18911 1727096292.02814: done with get_vars() 18911 1727096292.02825: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Monday 23 September 2024 08:58:12 -0400 (0:00:00.065) 0:00:11.144 ****** 18911 1727096292.03042: entering _queue_task() for managed_node1/service_facts 18911 1727096292.03044: Creating lock for service_facts 18911 1727096292.03701: worker is 1 (out of 1 available) 18911 1727096292.03829: exiting _queue_task() for managed_node1/service_facts 18911 1727096292.03841: done queuing things up, now waiting for results queue to drain 18911 1727096292.03843: waiting for pending results... 
18911 1727096292.04155: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 18911 1727096292.04573: in run() - task 0afff68d-5257-09a7-aae1-00000000027e 18911 1727096292.04577: variable 'ansible_search_path' from source: unknown 18911 1727096292.04580: variable 'ansible_search_path' from source: unknown 18911 1727096292.04582: calling self._execute() 18911 1727096292.04773: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096292.04776: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096292.04779: variable 'omit' from source: magic vars 18911 1727096292.05449: variable 'ansible_distribution_major_version' from source: facts 18911 1727096292.05469: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096292.05701: variable 'omit' from source: magic vars 18911 1727096292.05704: variable 'omit' from source: magic vars 18911 1727096292.05706: variable 'omit' from source: magic vars 18911 1727096292.05798: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18911 1727096292.05844: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18911 1727096292.05941: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18911 1727096292.05963: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096292.05982: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096292.06017: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18911 1727096292.06172: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096292.06175: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 18911 1727096292.06463: Set connection var ansible_shell_executable to /bin/sh 18911 1727096292.06466: Set connection var ansible_timeout to 10 18911 1727096292.06470: Set connection var ansible_shell_type to sh 18911 1727096292.06472: Set connection var ansible_module_compression to ZIP_DEFLATED 18911 1727096292.06474: Set connection var ansible_pipelining to False 18911 1727096292.06477: Set connection var ansible_connection to ssh 18911 1727096292.06478: variable 'ansible_shell_executable' from source: unknown 18911 1727096292.06480: variable 'ansible_connection' from source: unknown 18911 1727096292.06484: variable 'ansible_module_compression' from source: unknown 18911 1727096292.06485: variable 'ansible_shell_type' from source: unknown 18911 1727096292.06487: variable 'ansible_shell_executable' from source: unknown 18911 1727096292.06489: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096292.06491: variable 'ansible_pipelining' from source: unknown 18911 1727096292.06492: variable 'ansible_timeout' from source: unknown 18911 1727096292.06494: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096292.06838: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18911 1727096292.06855: variable 'omit' from source: magic vars 18911 1727096292.06906: starting attempt loop 18911 1727096292.06914: running the handler 18911 1727096292.06932: _low_level_execute_command(): starting 18911 1727096292.06983: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18911 1727096292.08605: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 18911 1727096292.08624: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096292.08720: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096292.08732: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096292.08866: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096292.08975: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096292.10722: stdout chunk (state=3): >>>/root <<< 18911 1727096292.11013: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096292.11016: stdout chunk (state=3): >>><<< 18911 1727096292.11018: stderr chunk (state=3): >>><<< 18911 1727096292.11020: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096292.11022: _low_level_execute_command(): starting 18911 1727096292.11025: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096292.10916-19512-260032854967345 `" && echo ansible-tmp-1727096292.10916-19512-260032854967345="` echo /root/.ansible/tmp/ansible-tmp-1727096292.10916-19512-260032854967345 `" ) && sleep 0' 18911 1727096292.12397: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096292.12400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096292.12403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096292.12406: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096292.12892: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096292.12983: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096292.14960: stdout chunk (state=3): >>>ansible-tmp-1727096292.10916-19512-260032854967345=/root/.ansible/tmp/ansible-tmp-1727096292.10916-19512-260032854967345 <<< 18911 1727096292.15514: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096292.15518: stdout chunk (state=3): >>><<< 18911 1727096292.15520: stderr chunk (state=3): >>><<< 18911 1727096292.15525: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096292.10916-19512-260032854967345=/root/.ansible/tmp/ansible-tmp-1727096292.10916-19512-260032854967345 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096292.15660: variable 'ansible_module_compression' from source: unknown 18911 1727096292.15714: ANSIBALLZ: Using lock for service_facts 18911 1727096292.15723: ANSIBALLZ: Acquiring lock 18911 1727096292.15733: ANSIBALLZ: Lock acquired: 140481132645936 18911 1727096292.15741: ANSIBALLZ: Creating module 18911 1727096292.39811: ANSIBALLZ: Writing module into payload 18911 1727096292.39970: ANSIBALLZ: Writing module 18911 1727096292.40176: ANSIBALLZ: Renaming module 18911 1727096292.40180: ANSIBALLZ: Done creating module 18911 1727096292.40182: variable 'ansible_facts' from source: unknown 18911 1727096292.40313: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096292.10916-19512-260032854967345/AnsiballZ_service_facts.py 18911 1727096292.40775: Sending initial data 18911 1727096292.40878: Sent initial data (160 bytes) 18911 1727096292.42187: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 
18911 1727096292.42207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096292.42225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096292.42241: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 18911 1727096292.42253: stderr chunk (state=3): >>>debug2: match found <<< 18911 1727096292.42271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096292.42499: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096292.42572: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096292.44232: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18911 1727096292.44294: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18911 1727096292.44410: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18911d7od04qi/tmps4qxxzvl /root/.ansible/tmp/ansible-tmp-1727096292.10916-19512-260032854967345/AnsiballZ_service_facts.py <<< 18911 1727096292.44413: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096292.10916-19512-260032854967345/AnsiballZ_service_facts.py" <<< 18911 1727096292.44494: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18911d7od04qi/tmps4qxxzvl" to remote "/root/.ansible/tmp/ansible-tmp-1727096292.10916-19512-260032854967345/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096292.10916-19512-260032854967345/AnsiballZ_service_facts.py" <<< 18911 1727096292.46264: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096292.46270: stdout chunk (state=3): >>><<< 18911 1727096292.46273: stderr chunk (state=3): >>><<< 18911 1727096292.46275: done transferring module to remote 18911 1727096292.46277: _low_level_execute_command(): starting 18911 1727096292.46279: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096292.10916-19512-260032854967345/ /root/.ansible/tmp/ansible-tmp-1727096292.10916-19512-260032854967345/AnsiballZ_service_facts.py && sleep 0' 18911 1727096292.47170: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096292.47185: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096292.47199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096292.47221: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096292.47326: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 
10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096292.47352: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096292.47465: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096292.49427: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096292.49431: stdout chunk (state=3): >>><<< 18911 1727096292.49433: stderr chunk (state=3): >>><<< 18911 1727096292.49781: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096292.49785: _low_level_execute_command(): starting 18911 1727096292.49787: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096292.10916-19512-260032854967345/AnsiballZ_service_facts.py && sleep 0' 18911 1727096292.50976: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096292.50980: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096292.50982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096292.50985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096292.50987: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 18911 1727096292.50989: stderr chunk (state=3): >>>debug2: match not found <<< 18911 1727096292.51149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096292.51373: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096292.51398: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096294.11587: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": 
"stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": 
{"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.s<<< 18911 1727096294.11641: stdout chunk (state=3): >>>ervice", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": 
"systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": 
"inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", 
"source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 18911 1727096294.13566: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. <<< 18911 1727096294.13572: stdout chunk (state=3): >>><<< 18911 1727096294.13575: stderr chunk (state=3): >>><<< 18911 1727096294.13579: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", 
"state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", 
"source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": 
"systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": 
"sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": 
"systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": 
{"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": 
"systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
Shared connection to 10.31.11.125 closed. 18911 1727096294.15753: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096292.10916-19512-260032854967345/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18911 1727096294.15757: _low_level_execute_command(): starting 18911 1727096294.15760: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096292.10916-19512-260032854967345/ > /dev/null 2>&1 && sleep 0' 18911 1727096294.16485: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096294.16488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096294.16491: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096294.16493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096294.16539: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096294.16555: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096294.16697: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096294.18725: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096294.18755: stderr chunk (state=3): >>><<< 18911 1727096294.18765: stdout chunk (state=3): >>><<< 18911 1727096294.19298: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status 
from master 0 18911 1727096294.19301: handler run complete 18911 1727096294.19707: variable 'ansible_facts' from source: unknown 18911 1727096294.24166: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096294.25906: variable 'ansible_facts' from source: unknown 18911 1727096294.26244: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096294.26938: attempt loop complete, returning result 18911 1727096294.26951: _execute() done 18911 1727096294.27035: dumping result to json 18911 1727096294.27106: done dumping result, returning 18911 1727096294.27579: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [0afff68d-5257-09a7-aae1-00000000027e] 18911 1727096294.27582: sending task result for task 0afff68d-5257-09a7-aae1-00000000027e ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18911 1727096294.29835: no more pending results, returning what we have 18911 1727096294.29839: results queue empty 18911 1727096294.29840: checking for any_errors_fatal 18911 1727096294.29845: done checking for any_errors_fatal 18911 1727096294.29845: checking for max_fail_percentage 18911 1727096294.29847: done checking for max_fail_percentage 18911 1727096294.29847: checking to see if all hosts have failed and the running result is not ok 18911 1727096294.29848: done checking to see if all hosts have failed 18911 1727096294.29849: getting the remaining hosts for this loop 18911 1727096294.29850: done getting the remaining hosts for this loop 18911 1727096294.29853: getting the next task for host managed_node1 18911 1727096294.29858: done getting next task for host managed_node1 18911 1727096294.29865: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 
18911 1727096294.29870: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18911 1727096294.29879: getting variables 18911 1727096294.29881: in VariableManager get_vars() 18911 1727096294.29921: Calling all_inventory to load vars for managed_node1 18911 1727096294.29924: Calling groups_inventory to load vars for managed_node1 18911 1727096294.29927: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096294.29936: Calling all_plugins_play to load vars for managed_node1 18911 1727096294.29939: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096294.29941: Calling groups_plugins_play to load vars for managed_node1 18911 1727096294.30481: done sending task result for task 0afff68d-5257-09a7-aae1-00000000027e 18911 1727096294.30485: WORKER PROCESS EXITING 18911 1727096294.31493: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096294.32970: done with get_vars() 18911 1727096294.32986: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Monday 23 September 2024 08:58:14 -0400 (0:00:02.300) 0:00:13.445 ****** 18911 1727096294.33086: entering _queue_task() for managed_node1/package_facts 18911 1727096294.33088: Creating lock for package_facts 18911 1727096294.33896: worker is 1 
(out of 1 available) 18911 1727096294.33910: exiting _queue_task() for managed_node1/package_facts 18911 1727096294.33921: done queuing things up, now waiting for results queue to drain 18911 1727096294.33923: waiting for pending results... 18911 1727096294.34444: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 18911 1727096294.34551: in run() - task 0afff68d-5257-09a7-aae1-00000000027f 18911 1727096294.34891: variable 'ansible_search_path' from source: unknown 18911 1727096294.34982: variable 'ansible_search_path' from source: unknown 18911 1727096294.35024: calling self._execute() 18911 1727096294.35386: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096294.35398: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096294.35536: variable 'omit' from source: magic vars 18911 1727096294.36582: variable 'ansible_distribution_major_version' from source: facts 18911 1727096294.36636: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096294.36685: variable 'omit' from source: magic vars 18911 1727096294.36858: variable 'omit' from source: magic vars 18911 1727096294.36922: variable 'omit' from source: magic vars 18911 1727096294.37090: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18911 1727096294.37402: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18911 1727096294.37504: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18911 1727096294.37508: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096294.37511: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096294.37513: 
variable 'inventory_hostname' from source: host vars for 'managed_node1' 18911 1727096294.37518: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096294.37522: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096294.37755: Set connection var ansible_shell_executable to /bin/sh 18911 1727096294.37826: Set connection var ansible_timeout to 10 18911 1727096294.37830: Set connection var ansible_shell_type to sh 18911 1727096294.37842: Set connection var ansible_module_compression to ZIP_DEFLATED 18911 1727096294.37889: Set connection var ansible_pipelining to False 18911 1727096294.37960: Set connection var ansible_connection to ssh 18911 1727096294.37988: variable 'ansible_shell_executable' from source: unknown 18911 1727096294.37995: variable 'ansible_connection' from source: unknown 18911 1727096294.38001: variable 'ansible_module_compression' from source: unknown 18911 1727096294.38028: variable 'ansible_shell_type' from source: unknown 18911 1727096294.38031: variable 'ansible_shell_executable' from source: unknown 18911 1727096294.38036: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096294.38043: variable 'ansible_pipelining' from source: unknown 18911 1727096294.38046: variable 'ansible_timeout' from source: unknown 18911 1727096294.38100: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096294.38643: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18911 1727096294.38655: variable 'omit' from source: magic vars 18911 1727096294.38664: starting attempt loop 18911 1727096294.38669: running the handler 18911 1727096294.38688: _low_level_execute_command(): starting 18911 1727096294.38799: _low_level_execute_command(): 
executing: /bin/sh -c 'echo ~ && sleep 0' 18911 1727096294.40399: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096294.40587: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096294.40638: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096294.40746: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096294.40830: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096294.43135: stdout chunk (state=3): >>>/root <<< 18911 1727096294.43178: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096294.43181: stdout chunk (state=3): >>><<< 18911 1727096294.43184: stderr chunk (state=3): >>><<< 18911 1727096294.43187: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096294.43190: _low_level_execute_command(): starting 18911 1727096294.43196: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096294.4317536-19596-94770913434276 `" && echo ansible-tmp-1727096294.4317536-19596-94770913434276="` echo /root/.ansible/tmp/ansible-tmp-1727096294.4317536-19596-94770913434276 `" ) && sleep 0' 18911 1727096294.45043: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096294.45211: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096294.45395: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096294.45398: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096294.45497: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096294.47518: stdout chunk (state=3): >>>ansible-tmp-1727096294.4317536-19596-94770913434276=/root/.ansible/tmp/ansible-tmp-1727096294.4317536-19596-94770913434276 <<< 18911 1727096294.47721: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096294.47725: stderr chunk (state=3): >>><<< 18911 1727096294.47727: stdout chunk (state=3): >>><<< 18911 1727096294.47750: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096294.4317536-19596-94770913434276=/root/.ansible/tmp/ansible-tmp-1727096294.4317536-19596-94770913434276 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096294.47812: variable 'ansible_module_compression' from source: unknown 18911 1727096294.47862: ANSIBALLZ: Using lock for package_facts 18911 1727096294.47921: ANSIBALLZ: Acquiring lock 18911 1727096294.47925: ANSIBALLZ: Lock acquired: 140481131633200 18911 1727096294.47927: ANSIBALLZ: Creating module 18911 1727096294.88420: ANSIBALLZ: Writing module into payload 18911 1727096294.88563: ANSIBALLZ: Writing module 18911 1727096294.88592: ANSIBALLZ: Renaming module 18911 1727096294.88596: ANSIBALLZ: Done creating module 18911 1727096294.88619: variable 'ansible_facts' from source: unknown 18911 1727096294.88811: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096294.4317536-19596-94770913434276/AnsiballZ_package_facts.py 18911 1727096294.89186: Sending initial data 18911 1727096294.89190: Sent initial data (161 bytes) 18911 1727096294.89559: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096294.89611: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096294.89715: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096294.89737: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096294.89747: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096294.89860: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096294.91556: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18911 1727096294.91620: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18911 1727096294.91715: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18911d7od04qi/tmps1kmc3tf /root/.ansible/tmp/ansible-tmp-1727096294.4317536-19596-94770913434276/AnsiballZ_package_facts.py <<< 18911 1727096294.91718: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096294.4317536-19596-94770913434276/AnsiballZ_package_facts.py" <<< 18911 1727096294.91770: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18911d7od04qi/tmps1kmc3tf" to remote "/root/.ansible/tmp/ansible-tmp-1727096294.4317536-19596-94770913434276/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096294.4317536-19596-94770913434276/AnsiballZ_package_facts.py" <<< 18911 1727096294.93713: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096294.93879: stderr chunk (state=3): >>><<< 18911 1727096294.93882: stdout chunk (state=3): >>><<< 18911 1727096294.93885: done transferring module to remote 18911 1727096294.93887: _low_level_execute_command(): starting 18911 1727096294.93889: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096294.4317536-19596-94770913434276/ /root/.ansible/tmp/ansible-tmp-1727096294.4317536-19596-94770913434276/AnsiballZ_package_facts.py && sleep 0' 18911 1727096294.94433: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096294.94483: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096294.94575: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096294.94599: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096294.94708: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096294.96778: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096294.96783: stdout chunk (state=3): >>><<< 18911 1727096294.96789: stderr chunk (state=3): >>><<< 18911 1727096294.96793: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096294.96796: _low_level_execute_command(): starting 18911 1727096294.96799: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096294.4317536-19596-94770913434276/AnsiballZ_package_facts.py && sleep 0' 18911 1727096294.97493: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096294.97566: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096294.97585: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096294.97599: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096294.97620: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096294.97798: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 18911 1727096295.42120: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": 
"langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": 
[{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release"<<< 18911 1727096295.42198: stdout chunk (state=3): >>>: "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": 
"coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": 
"57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", 
"version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": 
"iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source"<<< 18911 1727096295.42217: stdout chunk (state=3): >>>: "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": 
"3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": 
"14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue",<<< 18911 1727096295.42311: stdout chunk (state=3): >>> "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": 
"1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", 
"release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", 
"epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": 
[{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, 
"arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "<<< 18911 1727096295.42318: stdout chunk (state=3): >>>epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": 
"9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", 
"version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.1<<< 18911 1727096295.42321: stdout chunk (state=3): >>>9", "release": "1.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", 
"release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", 
"epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 18911 1727096295.44128: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
<<< 18911 1727096295.44131: stdout chunk (state=3): >>><<< 18911 1727096295.44133: stderr chunk (state=3): >>><<< 18911 1727096295.44292: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
18911 1727096295.47108: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096294.4317536-19596-94770913434276/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18911 1727096295.47172: _low_level_execute_command(): starting 18911 1727096295.47176: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096294.4317536-19596-94770913434276/ > /dev/null 2>&1 && sleep 0' 18911 1727096295.47780: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096295.47792: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096295.47804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096295.47844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096295.47929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096295.47960: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096295.48069: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096295.50124: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096295.50128: stdout chunk (state=3): >>><<< 18911 1727096295.50131: stderr chunk (state=3): >>><<< 18911 1727096295.50435: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096295.50439: 
handler run complete 18911 1727096295.59604: variable 'ansible_facts' from source: unknown 18911 1727096295.60049: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096295.62810: variable 'ansible_facts' from source: unknown 18911 1727096295.63711: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096295.64601: attempt loop complete, returning result 18911 1727096295.64621: _execute() done 18911 1727096295.64624: dumping result to json 18911 1727096295.65072: done dumping result, returning 18911 1727096295.65076: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [0afff68d-5257-09a7-aae1-00000000027f] 18911 1727096295.65078: sending task result for task 0afff68d-5257-09a7-aae1-00000000027f ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18911 1727096295.69869: no more pending results, returning what we have 18911 1727096295.69872: results queue empty 18911 1727096295.69872: checking for any_errors_fatal 18911 1727096295.69997: done checking for any_errors_fatal 18911 1727096295.69999: checking for max_fail_percentage 18911 1727096295.70001: done checking for max_fail_percentage 18911 1727096295.70001: checking to see if all hosts have failed and the running result is not ok 18911 1727096295.70002: done checking to see if all hosts have failed 18911 1727096295.70003: getting the remaining hosts for this loop 18911 1727096295.70011: done getting the remaining hosts for this loop 18911 1727096295.70014: getting the next task for host managed_node1 18911 1727096295.70020: done getting next task for host managed_node1 18911 1727096295.70023: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 18911 1727096295.70025: ^ state is: HOST STATE: 
block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18911 1727096295.70056: done sending task result for task 0afff68d-5257-09a7-aae1-00000000027f
18911 1727096295.70059: WORKER PROCESS EXITING
18911 1727096295.70066: getting variables
18911 1727096295.70069: in VariableManager get_vars()
18911 1727096295.70170: Calling all_inventory to load vars for managed_node1
18911 1727096295.70473: Calling groups_inventory to load vars for managed_node1
18911 1727096295.70475: Calling all_plugins_inventory to load vars for managed_node1
18911 1727096295.70485: Calling all_plugins_play to load vars for managed_node1
18911 1727096295.70487: Calling groups_plugins_inventory to load vars for managed_node1
18911 1727096295.70490: Calling groups_plugins_play to load vars for managed_node1
18911 1727096295.73280: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18911 1727096295.78096: done with get_vars()
18911 1727096295.78127: done getting variables
18911 1727096295.78270: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Print network provider] **************
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7
Monday 23 September 2024 08:58:15 -0400 (0:00:01.452) 0:00:14.897 ******
18911 1727096295.78328: entering _queue_task() for managed_node1/debug
18911 1727096295.79254: worker is 1 (out of 1 available)
18911 1727096295.79288: exiting _queue_task() for
managed_node1/debug 18911 1727096295.79329: done queuing things up, now waiting for results queue to drain 18911 1727096295.79331: waiting for pending results... 18911 1727096295.79787: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 18911 1727096295.80120: in run() - task 0afff68d-5257-09a7-aae1-00000000001a 18911 1727096295.80166: variable 'ansible_search_path' from source: unknown 18911 1727096295.80176: variable 'ansible_search_path' from source: unknown 18911 1727096295.80287: calling self._execute() 18911 1727096295.80433: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096295.80438: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096295.80449: variable 'omit' from source: magic vars 18911 1727096295.81416: variable 'ansible_distribution_major_version' from source: facts 18911 1727096295.81427: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096295.81434: variable 'omit' from source: magic vars 18911 1727096295.81629: variable 'omit' from source: magic vars 18911 1727096295.81933: variable 'network_provider' from source: set_fact 18911 1727096295.81937: variable 'omit' from source: magic vars 18911 1727096295.81987: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18911 1727096295.82174: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18911 1727096295.82177: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18911 1727096295.82202: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096295.82278: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096295.82361: variable 
'inventory_hostname' from source: host vars for 'managed_node1' 18911 1727096295.82365: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096295.82369: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096295.82670: Set connection var ansible_shell_executable to /bin/sh 18911 1727096295.82680: Set connection var ansible_timeout to 10 18911 1727096295.82743: Set connection var ansible_shell_type to sh 18911 1727096295.82747: Set connection var ansible_module_compression to ZIP_DEFLATED 18911 1727096295.82749: Set connection var ansible_pipelining to False 18911 1727096295.82752: Set connection var ansible_connection to ssh 18911 1727096295.82754: variable 'ansible_shell_executable' from source: unknown 18911 1727096295.82836: variable 'ansible_connection' from source: unknown 18911 1727096295.82839: variable 'ansible_module_compression' from source: unknown 18911 1727096295.82841: variable 'ansible_shell_type' from source: unknown 18911 1727096295.82843: variable 'ansible_shell_executable' from source: unknown 18911 1727096295.82845: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096295.82847: variable 'ansible_pipelining' from source: unknown 18911 1727096295.82849: variable 'ansible_timeout' from source: unknown 18911 1727096295.82851: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096295.83138: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18911 1727096295.83142: variable 'omit' from source: magic vars 18911 1727096295.83184: starting attempt loop 18911 1727096295.83187: running the handler 18911 1727096295.83337: handler run complete 18911 1727096295.83344: attempt loop complete, 
returning result
18911 1727096295.83348: _execute() done
18911 1727096295.83350: dumping result to json
18911 1727096295.83352: done dumping result, returning
18911 1727096295.83355: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0afff68d-5257-09a7-aae1-00000000001a]
18911 1727096295.83357: sending task result for task 0afff68d-5257-09a7-aae1-00000000001a
18911 1727096295.83789: done sending task result for task 0afff68d-5257-09a7-aae1-00000000001a
18911 1727096295.83793: WORKER PROCESS EXITING
ok: [managed_node1] => {}

MSG:

Using network provider: nm
18911 1727096295.83872: no more pending results, returning what we have
18911 1727096295.83876: results queue empty
18911 1727096295.83877: checking for any_errors_fatal
18911 1727096295.83887: done checking for any_errors_fatal
18911 1727096295.83888: checking for max_fail_percentage
18911 1727096295.83890: done checking for max_fail_percentage
18911 1727096295.83891: checking to see if all hosts have failed and the running result is not ok
18911 1727096295.83891: done checking to see if all hosts have failed
18911 1727096295.83892: getting the remaining hosts for this loop
18911 1727096295.83893: done getting the remaining hosts for this loop
18911 1727096295.83896: getting the next task for host managed_node1
18911 1727096295.83902: done getting next task for host managed_node1
18911 1727096295.83906: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
18911 1727096295.83908: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18911 1727096295.84107: getting variables
18911 1727096295.84109: in VariableManager get_vars()
18911 1727096295.84266: Calling all_inventory to load vars for managed_node1
18911 1727096295.84316: Calling groups_inventory to load vars for managed_node1
18911 1727096295.84319: Calling all_plugins_inventory to load vars for managed_node1
18911 1727096295.84329: Calling all_plugins_play to load vars for managed_node1
18911 1727096295.84331: Calling groups_plugins_inventory to load vars for managed_node1
18911 1727096295.84366: Calling groups_plugins_play to load vars for managed_node1
18911 1727096295.88047: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18911 1727096295.93033: done with get_vars()
18911 1727096295.93225: done getting variables
18911 1727096295.93622: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] ***
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11
Monday 23 September 2024 08:58:15 -0400 (0:00:00.153) 0:00:15.050 ******
18911 1727096295.93651: entering _queue_task() for managed_node1/fail
18911 1727096295.95731: worker is 1 (out of 1 available)
18911 1727096295.95893: exiting _queue_task() for managed_node1/fail
18911 1727096295.95906: done queuing things up, now waiting for results queue to drain
18911 1727096295.95908: waiting for pending results...
18911 1727096295.96365: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
18911 1727096295.96518: in run() - task 0afff68d-5257-09a7-aae1-00000000001b
18911 1727096295.96531: variable 'ansible_search_path' from source: unknown
18911 1727096295.96535: variable 'ansible_search_path' from source: unknown
18911 1727096295.96695: calling self._execute()
18911 1727096295.96906: variable 'ansible_host' from source: host vars for 'managed_node1'
18911 1727096295.96912: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18911 1727096295.96923: variable 'omit' from source: magic vars
18911 1727096295.97875: variable 'ansible_distribution_major_version' from source: facts
18911 1727096295.97887: Evaluated conditional (ansible_distribution_major_version != '6'): True
18911 1727096295.98171: variable 'network_state' from source: role '' defaults
18911 1727096295.98293: Evaluated conditional (network_state != {}): False
18911 1727096295.98298: when evaluation is False, skipping this task
18911 1727096295.98301: _execute() done
18911 1727096295.98304: dumping result to json
18911 1727096295.98309: done dumping result, returning
18911 1727096295.98315: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0afff68d-5257-09a7-aae1-00000000001b]
18911 1727096295.98318: sending task result for task 0afff68d-5257-09a7-aae1-00000000001b
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
18911 1727096295.98571: no more pending results, returning what we have
18911 1727096295.98576: results queue empty
18911 1727096295.98577: checking for any_errors_fatal
18911 1727096295.98586: done
checking for any_errors_fatal 18911 1727096295.98587: checking for max_fail_percentage 18911 1727096295.98590: done checking for max_fail_percentage 18911 1727096295.98591: checking to see if all hosts have failed and the running result is not ok 18911 1727096295.98591: done checking to see if all hosts have failed 18911 1727096295.98594: getting the remaining hosts for this loop 18911 1727096295.98595: done getting the remaining hosts for this loop 18911 1727096295.98599: getting the next task for host managed_node1 18911 1727096295.98694: done getting next task for host managed_node1 18911 1727096295.98711: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 18911 1727096295.98751: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
18911 1727096295.98807: done sending task result for task 0afff68d-5257-09a7-aae1-00000000001b
18911 1727096295.98811: WORKER PROCESS EXITING
18911 1727096295.98918: getting variables
18911 1727096295.98921: in VariableManager get_vars()
18911 1727096295.99197: Calling all_inventory to load vars for managed_node1
18911 1727096295.99201: Calling groups_inventory to load vars for managed_node1
18911 1727096295.99205: Calling all_plugins_inventory to load vars for managed_node1
18911 1727096295.99218: Calling all_plugins_play to load vars for managed_node1
18911 1727096295.99224: Calling groups_plugins_inventory to load vars for managed_node1
18911 1727096295.99229: Calling groups_plugins_play to load vars for managed_node1
18911 1727096296.02191: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18911 1727096296.05553: done with get_vars()
18911 1727096296.05721: done getting variables
18911 1727096296.05786: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] ***
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18
Monday 23 September 2024 08:58:16 -0400 (0:00:00.121) 0:00:15.172 ******
18911 1727096296.05818: entering _queue_task() for managed_node1/fail
18911 1727096296.06538: worker is 1 (out of 1 available)
18911 1727096296.06553: exiting _queue_task() for managed_node1/fail
18911 1727096296.06564: done queuing things up, now waiting for results queue to drain
18911 1727096296.06566: waiting for pending results...
18911 1727096296.07584: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
18911 1727096296.07916: in run() - task 0afff68d-5257-09a7-aae1-00000000001c
18911 1727096296.08073: variable 'ansible_search_path' from source: unknown
18911 1727096296.08077: variable 'ansible_search_path' from source: unknown
18911 1727096296.08080: calling self._execute()
18911 1727096296.08182: variable 'ansible_host' from source: host vars for 'managed_node1'
18911 1727096296.08189: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18911 1727096296.08199: variable 'omit' from source: magic vars
18911 1727096296.09172: variable 'ansible_distribution_major_version' from source: facts
18911 1727096296.09177: Evaluated conditional (ansible_distribution_major_version != '6'): True
18911 1727096296.09228: variable 'network_state' from source: role '' defaults
18911 1727096296.09240: Evaluated conditional (network_state != {}): False
18911 1727096296.09244: when evaluation is False, skipping this task
18911 1727096296.09247: _execute() done
18911 1727096296.09249: dumping result to json
18911 1727096296.09481: done dumping result, returning
18911 1727096296.09488: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0afff68d-5257-09a7-aae1-00000000001c]
18911 1727096296.09494: sending task result for task 0afff68d-5257-09a7-aae1-00000000001c
18911 1727096296.09596: done sending task result for task 0afff68d-5257-09a7-aae1-00000000001c
18911 1727096296.09599: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
18911 1727096296.09656: no more pending results, returning what we have
18911
1727096296.09659: results queue empty 18911 1727096296.09660: checking for any_errors_fatal 18911 1727096296.09672: done checking for any_errors_fatal 18911 1727096296.09673: checking for max_fail_percentage 18911 1727096296.09675: done checking for max_fail_percentage 18911 1727096296.09677: checking to see if all hosts have failed and the running result is not ok 18911 1727096296.09678: done checking to see if all hosts have failed 18911 1727096296.09679: getting the remaining hosts for this loop 18911 1727096296.09681: done getting the remaining hosts for this loop 18911 1727096296.09684: getting the next task for host managed_node1 18911 1727096296.09691: done getting next task for host managed_node1 18911 1727096296.09695: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 18911 1727096296.09697: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
18911 1727096296.09713: getting variables
18911 1727096296.09715: in VariableManager get_vars()
18911 1727096296.09755: Calling all_inventory to load vars for managed_node1
18911 1727096296.09758: Calling groups_inventory to load vars for managed_node1
18911 1727096296.09761: Calling all_plugins_inventory to load vars for managed_node1
18911 1727096296.09777: Calling all_plugins_play to load vars for managed_node1
18911 1727096296.09780: Calling groups_plugins_inventory to load vars for managed_node1
18911 1727096296.09784: Calling groups_plugins_play to load vars for managed_node1
18911 1727096296.13205: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18911 1727096296.16385: done with get_vars()
18911 1727096296.16415: done getting variables
18911 1727096296.16642: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] ***
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25
Monday 23 September 2024 08:58:16 -0400 (0:00:00.108) 0:00:15.281 ******
18911 1727096296.16721: entering _queue_task() for managed_node1/fail
18911 1727096296.17149: worker is 1 (out of 1 available)
18911 1727096296.17164: exiting _queue_task() for managed_node1/fail
18911 1727096296.17177: done queuing things up, now waiting for results queue to drain
18911 1727096296.17179: waiting for pending results...
18911 1727096296.17457: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 18911 1727096296.17516: in run() - task 0afff68d-5257-09a7-aae1-00000000001d 18911 1727096296.17533: variable 'ansible_search_path' from source: unknown 18911 1727096296.17541: variable 'ansible_search_path' from source: unknown 18911 1727096296.17587: calling self._execute() 18911 1727096296.17771: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096296.17775: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096296.17778: variable 'omit' from source: magic vars 18911 1727096296.18165: variable 'ansible_distribution_major_version' from source: facts 18911 1727096296.18186: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096296.18542: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18911 1727096296.23383: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18911 1727096296.23675: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18911 1727096296.23679: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18911 1727096296.23682: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18911 1727096296.23684: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18911 1727096296.23853: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096296.24009: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096296.24039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096296.24125: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096296.24237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096296.24390: variable 'ansible_distribution_major_version' from source: facts 18911 1727096296.24410: Evaluated conditional (ansible_distribution_major_version | int > 9): True 18911 1727096296.24623: variable 'ansible_distribution' from source: facts 18911 1727096296.24872: variable '__network_rh_distros' from source: role '' defaults 18911 1727096296.24875: Evaluated conditional (ansible_distribution in __network_rh_distros): True 18911 1727096296.25237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096296.25328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096296.25358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 
1727096296.25451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096296.25536: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096296.25592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096296.25656: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096296.25710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096296.25759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096296.25781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096296.25827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096296.25975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 18911 1727096296.26174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096296.26177: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096296.26179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096296.26853: variable 'network_connections' from source: play vars 18911 1727096296.26874: variable 'interface' from source: set_fact 18911 1727096296.27175: variable 'interface' from source: set_fact 18911 1727096296.27179: variable 'interface' from source: set_fact 18911 1727096296.27181: variable 'interface' from source: set_fact 18911 1727096296.27183: variable 'network_state' from source: role '' defaults 18911 1727096296.27248: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18911 1727096296.27698: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18911 1727096296.27759: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18911 1727096296.27862: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18911 1727096296.27898: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18911 1727096296.28064: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18911 1727096296.28102: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
18911 1727096296.28130: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
18911 1727096296.28187: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
18911 1727096296.28306: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False
18911 1727096296.28314: when evaluation is False, skipping this task
18911 1727096296.28321: _execute() done
18911 1727096296.28328: dumping result to json
18911 1727096296.28334: done dumping result, returning
18911 1727096296.28345: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0afff68d-5257-09a7-aae1-00000000001d]
18911 1727096296.28382: sending task result for task 0afff68d-5257-09a7-aae1-00000000001d
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0",
    "skip_reason": "Conditional result was False"
}
18911 1727096296.28627: no more pending results, returning what we have
18911 1727096296.28631: results queue empty
18911 1727096296.28632: checking for
any_errors_fatal 18911 1727096296.28640: done checking for any_errors_fatal 18911 1727096296.28641: checking for max_fail_percentage 18911 1727096296.28643: done checking for max_fail_percentage 18911 1727096296.28645: checking to see if all hosts have failed and the running result is not ok 18911 1727096296.28646: done checking to see if all hosts have failed 18911 1727096296.28646: getting the remaining hosts for this loop 18911 1727096296.28648: done getting the remaining hosts for this loop 18911 1727096296.28651: getting the next task for host managed_node1 18911 1727096296.28659: done getting next task for host managed_node1 18911 1727096296.28663: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 18911 1727096296.28664: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096296.28680: getting variables 18911 1727096296.28682: in VariableManager get_vars() 18911 1727096296.28721: Calling all_inventory to load vars for managed_node1 18911 1727096296.28723: Calling groups_inventory to load vars for managed_node1 18911 1727096296.28726: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096296.28737: Calling all_plugins_play to load vars for managed_node1 18911 1727096296.28740: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096296.28743: Calling groups_plugins_play to load vars for managed_node1 18911 1727096296.29548: done sending task result for task 0afff68d-5257-09a7-aae1-00000000001d 18911 1727096296.29552: WORKER PROCESS EXITING 18911 1727096296.30632: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096296.32262: done with get_vars() 18911 1727096296.32285: done getting variables 18911 1727096296.32381: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Monday 23 September 2024 08:58:16 -0400 (0:00:00.156) 0:00:15.438 ****** 18911 1727096296.32410: entering _queue_task() for managed_node1/dnf 18911 1727096296.32710: worker is 1 (out of 1 available) 18911 1727096296.32722: exiting _queue_task() for managed_node1/dnf 18911 1727096296.32734: done queuing things up, now waiting for results queue to drain 18911 1727096296.32736: waiting for pending results... 
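The task skipped above was gated on a Jinja2 chain that looks for team-type connections. As an illustrative aside (not part of the log), the condition can be restated in plain Python; the role's real evaluation happens inside Ansible's templar with the collection's variables, and the sample connection data below is hypothetical:

```python
import re

def has_team_interfaces(network_connections, network_state):
    """Plain-Python equivalent of the role's false_condition:
    network_connections | selectattr("type", "defined")
                        | selectattr("type", "match", "^team$") | list | length > 0
    or the same test against network_state["interfaces"]."""
    def any_team(items):
        # selectattr("type", "defined") keeps only items that have a "type" key;
        # selectattr("type", "match", "^team$") keeps those whose type matches the regex.
        return any(re.match(r"^team$", item["type"]) for item in items if "type" in item)

    return any_team(network_connections) or any_team(network_state.get("interfaces", []))

# With no team connections defined the condition is False and the task is
# skipped, matching the "skip_reason" shown in the log output.
print(has_team_interfaces([{"name": "veth0", "type": "ethernet"}], {}))  # False
print(has_team_interfaces([{"name": "team0", "type": "team"}], {}))      # True
```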
18911 1727096296.33005: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 18911 1727096296.33112: in run() - task 0afff68d-5257-09a7-aae1-00000000001e 18911 1727096296.33130: variable 'ansible_search_path' from source: unknown 18911 1727096296.33139: variable 'ansible_search_path' from source: unknown 18911 1727096296.33180: calling self._execute() 18911 1727096296.33275: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096296.33286: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096296.33307: variable 'omit' from source: magic vars 18911 1727096296.33674: variable 'ansible_distribution_major_version' from source: facts 18911 1727096296.33691: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096296.33897: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18911 1727096296.36082: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18911 1727096296.36160: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18911 1727096296.36202: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18911 1727096296.36239: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18911 1727096296.36276: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18911 1727096296.36355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096296.36395: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096296.36424: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096296.36469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096296.36492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096296.36607: variable 'ansible_distribution' from source: facts 18911 1727096296.36618: variable 'ansible_distribution_major_version' from source: facts 18911 1727096296.36638: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 18911 1727096296.36750: variable '__network_wireless_connections_defined' from source: role '' defaults 18911 1727096296.36883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096296.36916: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096296.36946: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096296.36990: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096296.37009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096296.37073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096296.37076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096296.37098: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096296.37144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096296.37164: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096296.37233: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096296.37239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 
1727096296.37266: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096296.37474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096296.37477: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096296.37773: variable 'network_connections' from source: play vars 18911 1727096296.37777: variable 'interface' from source: set_fact 18911 1727096296.37779: variable 'interface' from source: set_fact 18911 1727096296.37781: variable 'interface' from source: set_fact 18911 1727096296.38028: variable 'interface' from source: set_fact 18911 1727096296.38214: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18911 1727096296.38541: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18911 1727096296.38583: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18911 1727096296.38688: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18911 1727096296.38782: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18911 1727096296.38829: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18911 1727096296.39174: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18911 1727096296.39185: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096296.39188: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18911 1727096296.39219: variable '__network_team_connections_defined' from source: role '' defaults 18911 1727096296.39590: variable 'network_connections' from source: play vars 18911 1727096296.39680: variable 'interface' from source: set_fact 18911 1727096296.39748: variable 'interface' from source: set_fact 18911 1727096296.39835: variable 'interface' from source: set_fact 18911 1727096296.39900: variable 'interface' from source: set_fact 18911 1727096296.40078: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 18911 1727096296.40086: when evaluation is False, skipping this task 18911 1727096296.40092: _execute() done 18911 1727096296.40098: dumping result to json 18911 1727096296.40104: done dumping result, returning 18911 1727096296.40115: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0afff68d-5257-09a7-aae1-00000000001e] 18911 1727096296.40122: sending task result for task 0afff68d-5257-09a7-aae1-00000000001e 18911 1727096296.40232: done sending task result for task 0afff68d-5257-09a7-aae1-00000000001e skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 18911 
1727096296.40308: no more pending results, returning what we have 18911 1727096296.40311: results queue empty 18911 1727096296.40312: checking for any_errors_fatal 18911 1727096296.40318: done checking for any_errors_fatal 18911 1727096296.40319: checking for max_fail_percentage 18911 1727096296.40321: done checking for max_fail_percentage 18911 1727096296.40322: checking to see if all hosts have failed and the running result is not ok 18911 1727096296.40322: done checking to see if all hosts have failed 18911 1727096296.40323: getting the remaining hosts for this loop 18911 1727096296.40324: done getting the remaining hosts for this loop 18911 1727096296.40328: getting the next task for host managed_node1 18911 1727096296.40335: done getting next task for host managed_node1 18911 1727096296.40339: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 18911 1727096296.40340: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096296.40353: getting variables 18911 1727096296.40354: in VariableManager get_vars() 18911 1727096296.40393: Calling all_inventory to load vars for managed_node1 18911 1727096296.40396: Calling groups_inventory to load vars for managed_node1 18911 1727096296.40398: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096296.40409: Calling all_plugins_play to load vars for managed_node1 18911 1727096296.40412: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096296.40415: Calling groups_plugins_play to load vars for managed_node1 18911 1727096296.41194: WORKER PROCESS EXITING 18911 1727096296.42051: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096296.43596: done with get_vars() 18911 1727096296.43621: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 18911 1727096296.43693: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Monday 23 September 2024 08:58:16 -0400 (0:00:00.113) 0:00:15.551 ****** 18911 1727096296.43723: entering _queue_task() for managed_node1/yum 18911 1727096296.43724: Creating lock for yum 18911 1727096296.44055: worker is 1 (out of 1 available) 18911 1727096296.44066: exiting _queue_task() for managed_node1/yum 18911 1727096296.44279: done queuing things up, now waiting for results queue to drain 18911 
1727096296.44281: waiting for pending results... 18911 1727096296.44347: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 18911 1727096296.44454: in run() - task 0afff68d-5257-09a7-aae1-00000000001f 18911 1727096296.44476: variable 'ansible_search_path' from source: unknown 18911 1727096296.44485: variable 'ansible_search_path' from source: unknown 18911 1727096296.44530: calling self._execute() 18911 1727096296.44625: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096296.44636: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096296.44653: variable 'omit' from source: magic vars 18911 1727096296.45019: variable 'ansible_distribution_major_version' from source: facts 18911 1727096296.45037: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096296.45216: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18911 1727096296.47372: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18911 1727096296.47454: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18911 1727096296.47497: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18911 1727096296.47535: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18911 1727096296.47572: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18911 1727096296.47658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 18911 1727096296.47695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096296.47724: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096296.47775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096296.47796: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096296.47902: variable 'ansible_distribution_major_version' from source: facts 18911 1727096296.47925: Evaluated conditional (ansible_distribution_major_version | int < 8): False 18911 1727096296.47933: when evaluation is False, skipping this task 18911 1727096296.47941: _execute() done 18911 1727096296.47947: dumping result to json 18911 1727096296.47953: done dumping result, returning 18911 1727096296.47963: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0afff68d-5257-09a7-aae1-00000000001f] 18911 1727096296.47977: sending task result for task 0afff68d-5257-09a7-aae1-00000000001f skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 18911 1727096296.48238: no more pending results, returning what we have 18911 1727096296.48242: results queue empty 18911 1727096296.48243: checking 
for any_errors_fatal 18911 1727096296.48249: done checking for any_errors_fatal 18911 1727096296.48250: checking for max_fail_percentage 18911 1727096296.48253: done checking for max_fail_percentage 18911 1727096296.48254: checking to see if all hosts have failed and the running result is not ok 18911 1727096296.48255: done checking to see if all hosts have failed 18911 1727096296.48255: getting the remaining hosts for this loop 18911 1727096296.48257: done getting the remaining hosts for this loop 18911 1727096296.48260: getting the next task for host managed_node1 18911 1727096296.48270: done getting next task for host managed_node1 18911 1727096296.48274: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 18911 1727096296.48276: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096296.48288: getting variables 18911 1727096296.48290: in VariableManager get_vars() 18911 1727096296.48330: Calling all_inventory to load vars for managed_node1 18911 1727096296.48334: Calling groups_inventory to load vars for managed_node1 18911 1727096296.48336: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096296.48348: Calling all_plugins_play to load vars for managed_node1 18911 1727096296.48351: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096296.48354: Calling groups_plugins_play to load vars for managed_node1 18911 1727096296.48882: done sending task result for task 0afff68d-5257-09a7-aae1-00000000001f 18911 1727096296.48885: WORKER PROCESS EXITING 18911 1727096296.54218: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096296.56396: done with get_vars() 18911 1727096296.56426: done getting variables 18911 1727096296.56477: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Monday 23 September 2024 08:58:16 -0400 (0:00:00.127) 0:00:15.679 ****** 18911 1727096296.56505: entering _queue_task() for managed_node1/fail 18911 1727096296.57034: worker is 1 (out of 1 available) 18911 1727096296.57049: exiting _queue_task() for managed_node1/fail 18911 1727096296.57063: done queuing things up, now waiting for results queue to drain 18911 1727096296.57064: waiting for pending results... 
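The two package-check tasks above are split by distribution major version: the DNF variant applies on Fedora or EL8+, while the YUM variant is limited to majors below 8 (and on this host is skipped with false_condition `ansible_distribution_major_version | int < 8`; note the log also shows `ansible.builtin.yum` being redirected to `ansible.builtin.dnf`). A simplified sketch of that gating, ignoring the wireless/team condition both tasks also carry (function name and return values are illustrative, not from the role):

```python
def package_check_path(ansible_distribution_major_version):
    """Rough model of the version conditionals visible in the log:
    - both tasks require major version != '6'
    - YUM check:  ansible_distribution_major_version | int < 8
    - DNF check:  ansible_distribution == 'Fedora' or major version > 7
    """
    major = int(ansible_distribution_major_version)
    if major == 6:
        return None   # neither check runs
    if major < 8:
        return "yum"  # EL7-era hosts take the YUM path
    return "dnf"      # EL8+ (and Fedora) take the DNF path

print(package_check_path("9"))  # dnf  (matches the skipped YUM task in this run)
print(package_check_path("7"))  # yum
```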
18911 1727096296.57613: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 18911 1727096296.57778: in run() - task 0afff68d-5257-09a7-aae1-000000000020 18911 1727096296.57782: variable 'ansible_search_path' from source: unknown 18911 1727096296.57785: variable 'ansible_search_path' from source: unknown 18911 1727096296.57788: calling self._execute() 18911 1727096296.57886: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096296.57892: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096296.57911: variable 'omit' from source: magic vars 18911 1727096296.58339: variable 'ansible_distribution_major_version' from source: facts 18911 1727096296.58351: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096296.58488: variable '__network_wireless_connections_defined' from source: role '' defaults 18911 1727096296.58709: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18911 1727096296.61372: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18911 1727096296.61453: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18911 1727096296.61497: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18911 1727096296.61538: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18911 1727096296.61569: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18911 1727096296.61654: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 18911 1727096296.61734: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096296.61738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096296.61765: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096296.61789: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096296.61841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096296.62073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096296.62076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096296.62078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096296.62080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096296.62082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096296.62084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096296.62086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096296.62087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096296.62119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096296.62329: variable 'network_connections' from source: play vars 18911 1727096296.62348: variable 'interface' from source: set_fact 18911 1727096296.62430: variable 'interface' from source: set_fact 18911 1727096296.62445: variable 'interface' from source: set_fact 18911 1727096296.62512: variable 'interface' from source: set_fact 18911 1727096296.62593: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18911 1727096296.62788: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18911 1727096296.62829: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18911 1727096296.62872: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18911 1727096296.62906: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18911 1727096296.62953: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18911 1727096296.62986: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18911 1727096296.63018: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096296.63050: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18911 1727096296.63120: variable '__network_team_connections_defined' from source: role '' defaults 18911 1727096296.63373: variable 'network_connections' from source: play vars 18911 1727096296.63406: variable 'interface' from source: set_fact 18911 1727096296.63451: variable 'interface' from source: set_fact 18911 1727096296.63463: variable 'interface' from source: set_fact 18911 1727096296.63531: variable 'interface' from source: set_fact 18911 1727096296.63624: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 18911 1727096296.63628: when evaluation is False, skipping this task 18911 1727096296.63630: _execute() done 18911 1727096296.63632: dumping result to json 18911 1727096296.63634: done dumping result, returning 18911 1727096296.63636: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's 
consent to restart NetworkManager due to wireless or team interfaces [0afff68d-5257-09a7-aae1-000000000020] 18911 1727096296.63647: sending task result for task 0afff68d-5257-09a7-aae1-000000000020 skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 18911 1727096296.63781: no more pending results, returning what we have 18911 1727096296.63784: results queue empty 18911 1727096296.63785: checking for any_errors_fatal 18911 1727096296.63794: done checking for any_errors_fatal 18911 1727096296.63795: checking for max_fail_percentage 18911 1727096296.63797: done checking for max_fail_percentage 18911 1727096296.63798: checking to see if all hosts have failed and the running result is not ok 18911 1727096296.63798: done checking to see if all hosts have failed 18911 1727096296.63799: getting the remaining hosts for this loop 18911 1727096296.63800: done getting the remaining hosts for this loop 18911 1727096296.63804: getting the next task for host managed_node1 18911 1727096296.63811: done getting next task for host managed_node1 18911 1727096296.63815: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 18911 1727096296.63817: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096296.63831: getting variables 18911 1727096296.63833: in VariableManager get_vars() 18911 1727096296.63875: Calling all_inventory to load vars for managed_node1 18911 1727096296.63878: Calling groups_inventory to load vars for managed_node1 18911 1727096296.63880: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096296.63892: Calling all_plugins_play to load vars for managed_node1 18911 1727096296.63895: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096296.63898: Calling groups_plugins_play to load vars for managed_node1 18911 1727096296.64581: done sending task result for task 0afff68d-5257-09a7-aae1-000000000020 18911 1727096296.64584: WORKER PROCESS EXITING 18911 1727096296.65614: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096296.67539: done with get_vars() 18911 1727096296.67574: done getting variables 18911 1727096296.67636: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Monday 23 September 2024 08:58:16 -0400 (0:00:00.111) 0:00:15.791 ****** 18911 1727096296.67691: entering _queue_task() for managed_node1/package 18911 1727096296.68092: worker is 1 (out of 1 available) 18911 1727096296.68104: exiting _queue_task() for managed_node1/package 18911 1727096296.68117: done queuing things up, now waiting for results queue to drain 18911 1727096296.68120: waiting for pending results... 
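[Editor's note: the "Install packages" task queued above (task path `main.yml:73`, `package` action plugin) is the role's idempotence guard. Based on the task name and the `false_condition` reported further down in this log, it presumably looks something like the following sketch; the `name:` argument and `state: present` are assumptions, not taken from the role source.]

```yaml
# Sketch only: task name, module, and when-expression match this log;
# the module arguments are hypothetical.
- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"   # assumed; the log only names the variable
    state: present                   # assumed
  when: not network_packages is subset(ansible_facts.packages.keys())
```

[The `subset` test makes the task a no-op when every member of `network_packages` already appears in the gathered package facts, which is exactly why this run reports "Conditional result was False" and skips.]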
18911 1727096296.68440: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 18911 1727096296.68680: in run() - task 0afff68d-5257-09a7-aae1-000000000021 18911 1727096296.68684: variable 'ansible_search_path' from source: unknown 18911 1727096296.68728: variable 'ansible_search_path' from source: unknown 18911 1727096296.68773: calling self._execute() 18911 1727096296.68903: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096296.68913: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096296.68956: variable 'omit' from source: magic vars 18911 1727096296.69530: variable 'ansible_distribution_major_version' from source: facts 18911 1727096296.69555: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096296.69812: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18911 1727096296.70255: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18911 1727096296.70272: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18911 1727096296.70331: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18911 1727096296.70399: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18911 1727096296.70542: variable 'network_packages' from source: role '' defaults 18911 1727096296.70702: variable '__network_provider_setup' from source: role '' defaults 18911 1727096296.70741: variable '__network_service_name_default_nm' from source: role '' defaults 18911 1727096296.70993: variable '__network_service_name_default_nm' from source: role '' defaults 18911 1727096296.70996: variable '__network_packages_default_nm' from source: role '' defaults 18911 1727096296.71002: variable 
'__network_packages_default_nm' from source: role '' defaults 18911 1727096296.71211: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18911 1727096296.73996: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18911 1727096296.74050: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18911 1727096296.74088: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18911 1727096296.74118: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18911 1727096296.74142: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18911 1727096296.74219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096296.74246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096296.74271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096296.74313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096296.74324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 
1727096296.74367: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096296.74390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096296.74414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096296.74448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096296.74460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096296.74753: variable '__network_packages_default_gobject_packages' from source: role '' defaults 18911 1727096296.74778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096296.74815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096296.74843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096296.74882: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096296.74897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096296.74987: variable 'ansible_python' from source: facts 18911 1727096296.75012: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 18911 1727096296.75119: variable '__network_wpa_supplicant_required' from source: role '' defaults 18911 1727096296.75177: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 18911 1727096296.75305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096296.75322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096296.75347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096296.75389: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096296.75412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096296.75447: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096296.75490: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096296.75496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096296.75573: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096296.75577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096296.75694: variable 'network_connections' from source: play vars 18911 1727096296.75703: variable 'interface' from source: set_fact 18911 1727096296.75851: variable 'interface' from source: set_fact 18911 1727096296.75854: variable 'interface' from source: set_fact 18911 1727096296.75908: variable 'interface' from source: set_fact 18911 1727096296.75982: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18911 1727096296.76009: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18911 1727096296.76376: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096296.76379: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18911 1727096296.76382: variable '__network_wireless_connections_defined' from source: role '' defaults 18911 1727096296.76400: variable 'network_connections' from source: play vars 18911 1727096296.76403: variable 'interface' from source: set_fact 18911 1727096296.76501: variable 'interface' from source: set_fact 18911 1727096296.76510: variable 'interface' from source: set_fact 18911 1727096296.76629: variable 'interface' from source: set_fact 18911 1727096296.76653: variable '__network_packages_default_wireless' from source: role '' defaults 18911 1727096296.76737: variable '__network_wireless_connections_defined' from source: role '' defaults 18911 1727096296.77032: variable 'network_connections' from source: play vars 18911 1727096296.77036: variable 'interface' from source: set_fact 18911 1727096296.77102: variable 'interface' from source: set_fact 18911 1727096296.77108: variable 'interface' from source: set_fact 18911 1727096296.77176: variable 'interface' from source: set_fact 18911 1727096296.77199: variable '__network_packages_default_team' from source: role '' defaults 18911 1727096296.77281: variable '__network_team_connections_defined' from source: role '' defaults 18911 1727096296.77563: variable 'network_connections' from source: play vars 18911 1727096296.77569: variable 'interface' from source: set_fact 18911 1727096296.77630: variable 'interface' from source: set_fact 18911 1727096296.77633: variable 'interface' from source: set_fact 18911 1727096296.77955: variable 'interface' from source: set_fact 18911 1727096296.77959: variable '__network_service_name_default_initscripts' from source: role '' defaults 18911 
1727096296.77963: variable '__network_service_name_default_initscripts' from source: role '' defaults 18911 1727096296.77966: variable '__network_packages_default_initscripts' from source: role '' defaults 18911 1727096296.77969: variable '__network_packages_default_initscripts' from source: role '' defaults 18911 1727096296.78295: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 18911 1727096296.78936: variable 'network_connections' from source: play vars 18911 1727096296.78940: variable 'interface' from source: set_fact 18911 1727096296.79027: variable 'interface' from source: set_fact 18911 1727096296.79043: variable 'interface' from source: set_fact 18911 1727096296.79128: variable 'interface' from source: set_fact 18911 1727096296.79134: variable 'ansible_distribution' from source: facts 18911 1727096296.79138: variable '__network_rh_distros' from source: role '' defaults 18911 1727096296.79151: variable 'ansible_distribution_major_version' from source: facts 18911 1727096296.79178: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 18911 1727096296.79423: variable 'ansible_distribution' from source: facts 18911 1727096296.79426: variable '__network_rh_distros' from source: role '' defaults 18911 1727096296.79432: variable 'ansible_distribution_major_version' from source: facts 18911 1727096296.79453: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 18911 1727096296.79650: variable 'ansible_distribution' from source: facts 18911 1727096296.79653: variable '__network_rh_distros' from source: role '' defaults 18911 1727096296.79659: variable 'ansible_distribution_major_version' from source: facts 18911 1727096296.79701: variable 'network_provider' from source: set_fact 18911 1727096296.79719: variable 'ansible_facts' from source: unknown 18911 1727096296.80784: Evaluated conditional (not network_packages is 
subset(ansible_facts.packages.keys())): False 18911 1727096296.80791: when evaluation is False, skipping this task 18911 1727096296.80797: _execute() done 18911 1727096296.80799: dumping result to json 18911 1727096296.80801: done dumping result, returning 18911 1727096296.80804: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0afff68d-5257-09a7-aae1-000000000021] 18911 1727096296.80829: sending task result for task 0afff68d-5257-09a7-aae1-000000000021 18911 1727096296.81023: done sending task result for task 0afff68d-5257-09a7-aae1-000000000021 18911 1727096296.81027: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 18911 1727096296.81085: no more pending results, returning what we have 18911 1727096296.81091: results queue empty 18911 1727096296.81092: checking for any_errors_fatal 18911 1727096296.81101: done checking for any_errors_fatal 18911 1727096296.81103: checking for max_fail_percentage 18911 1727096296.81105: done checking for max_fail_percentage 18911 1727096296.81106: checking to see if all hosts have failed and the running result is not ok 18911 1727096296.81107: done checking to see if all hosts have failed 18911 1727096296.81107: getting the remaining hosts for this loop 18911 1727096296.81109: done getting the remaining hosts for this loop 18911 1727096296.81115: getting the next task for host managed_node1 18911 1727096296.81125: done getting next task for host managed_node1 18911 1727096296.81130: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 18911 1727096296.81132: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18911 1727096296.81146: getting variables 18911 1727096296.81148: in VariableManager get_vars() 18911 1727096296.81400: Calling all_inventory to load vars for managed_node1 18911 1727096296.81404: Calling groups_inventory to load vars for managed_node1 18911 1727096296.81406: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096296.81461: Calling all_plugins_play to load vars for managed_node1 18911 1727096296.81470: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096296.81475: Calling groups_plugins_play to load vars for managed_node1 18911 1727096296.83349: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096296.85122: done with get_vars() 18911 1727096296.85148: done getting variables 18911 1727096296.85210: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Monday 23 September 2024 08:58:16 -0400 (0:00:00.175) 0:00:15.966 ****** 18911 1727096296.85241: entering _queue_task() for managed_node1/package 18911 1727096296.85608: worker is 1 (out of 1 available) 18911 1727096296.85622: exiting _queue_task() for managed_node1/package 18911 1727096296.85636: done queuing things up, now waiting for results queue to drain 18911 1727096296.85638: waiting for pending results... 
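[Editor's note: the task queued above (task path `main.yml:85`, `package` action plugin) is guarded by the `network_state != {}` condition that the log evaluates just below. A hedged sketch of such a task follows; the package list and `state:` are assumptions inferred from the task title, not taken from the role source.]

```yaml
# Sketch only: name and when-expression match this log;
# the package list is a guess based on the task title.
- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager   # assumed
      - nmstate          # assumed
    state: present       # assumed
  when: network_state != {}
```

[Since this play defines no `network_state`, the role default (an empty dict) makes the condition false and the task is skipped.]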
18911 1727096296.86038: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 18911 1727096296.86043: in run() - task 0afff68d-5257-09a7-aae1-000000000022 18911 1727096296.86047: variable 'ansible_search_path' from source: unknown 18911 1727096296.86050: variable 'ansible_search_path' from source: unknown 18911 1727096296.86172: calling self._execute() 18911 1727096296.86190: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096296.86195: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096296.86214: variable 'omit' from source: magic vars 18911 1727096296.86643: variable 'ansible_distribution_major_version' from source: facts 18911 1727096296.86654: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096296.86783: variable 'network_state' from source: role '' defaults 18911 1727096296.86791: Evaluated conditional (network_state != {}): False 18911 1727096296.86795: when evaluation is False, skipping this task 18911 1727096296.86797: _execute() done 18911 1727096296.86800: dumping result to json 18911 1727096296.86802: done dumping result, returning 18911 1727096296.86891: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0afff68d-5257-09a7-aae1-000000000022] 18911 1727096296.86895: sending task result for task 0afff68d-5257-09a7-aae1-000000000022 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18911 1727096296.87012: no more pending results, returning what we have 18911 1727096296.87016: results queue empty 18911 1727096296.87017: checking for any_errors_fatal 18911 1727096296.87026: done checking for any_errors_fatal 18911 1727096296.87027: checking for max_fail_percentage 18911 
1727096296.87029: done checking for max_fail_percentage 18911 1727096296.87030: checking to see if all hosts have failed and the running result is not ok 18911 1727096296.87031: done checking to see if all hosts have failed 18911 1727096296.87031: getting the remaining hosts for this loop 18911 1727096296.87033: done getting the remaining hosts for this loop 18911 1727096296.87036: getting the next task for host managed_node1 18911 1727096296.87045: done getting next task for host managed_node1 18911 1727096296.87049: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 18911 1727096296.87052: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18911 1727096296.87072: done sending task result for task 0afff68d-5257-09a7-aae1-000000000022 18911 1727096296.87074: WORKER PROCESS EXITING 18911 1727096296.87195: getting variables 18911 1727096296.87197: in VariableManager get_vars() 18911 1727096296.87266: Calling all_inventory to load vars for managed_node1 18911 1727096296.87271: Calling groups_inventory to load vars for managed_node1 18911 1727096296.87273: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096296.87283: Calling all_plugins_play to load vars for managed_node1 18911 1727096296.87286: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096296.87289: Calling groups_plugins_play to load vars for managed_node1 18911 1727096296.88992: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096296.91331: done with get_vars() 18911 1727096296.91356: done getting variables 18911 1727096296.91456: Loading ActionModule 'package' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Monday 23 September 2024 08:58:16 -0400 (0:00:00.062) 0:00:16.029 ****** 18911 1727096296.91495: entering _queue_task() for managed_node1/package 18911 1727096296.91878: worker is 1 (out of 1 available) 18911 1727096296.91891: exiting _queue_task() for managed_node1/package 18911 1727096296.91945: done queuing things up, now waiting for results queue to drain 18911 1727096296.91947: waiting for pending results... 18911 1727096296.92196: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 18911 1727096296.92373: in run() - task 0afff68d-5257-09a7-aae1-000000000023 18911 1727096296.92376: variable 'ansible_search_path' from source: unknown 18911 1727096296.92378: variable 'ansible_search_path' from source: unknown 18911 1727096296.92381: calling self._execute() 18911 1727096296.92470: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096296.92519: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096296.92575: variable 'omit' from source: magic vars 18911 1727096296.93242: variable 'ansible_distribution_major_version' from source: facts 18911 1727096296.93472: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096296.93475: variable 'network_state' from source: role '' defaults 18911 1727096296.93478: Evaluated conditional (network_state != {}): False 18911 1727096296.93480: when evaluation is False, 
skipping this task 18911 1727096296.93482: _execute() done 18911 1727096296.93485: dumping result to json 18911 1727096296.93487: done dumping result, returning 18911 1727096296.93489: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0afff68d-5257-09a7-aae1-000000000023] 18911 1727096296.93491: sending task result for task 0afff68d-5257-09a7-aae1-000000000023 18911 1727096296.93976: done sending task result for task 0afff68d-5257-09a7-aae1-000000000023 18911 1727096296.93979: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18911 1727096296.94033: no more pending results, returning what we have 18911 1727096296.94038: results queue empty 18911 1727096296.94039: checking for any_errors_fatal 18911 1727096296.94046: done checking for any_errors_fatal 18911 1727096296.94046: checking for max_fail_percentage 18911 1727096296.94049: done checking for max_fail_percentage 18911 1727096296.94050: checking to see if all hosts have failed and the running result is not ok 18911 1727096296.94053: done checking to see if all hosts have failed 18911 1727096296.94054: getting the remaining hosts for this loop 18911 1727096296.94056: done getting the remaining hosts for this loop 18911 1727096296.94059: getting the next task for host managed_node1 18911 1727096296.94066: done getting next task for host managed_node1 18911 1727096296.94072: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 18911 1727096296.94075: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096296.94089: getting variables 18911 1727096296.94091: in VariableManager get_vars() 18911 1727096296.94128: Calling all_inventory to load vars for managed_node1 18911 1727096296.94131: Calling groups_inventory to load vars for managed_node1 18911 1727096296.94134: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096296.94145: Calling all_plugins_play to load vars for managed_node1 18911 1727096296.94148: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096296.94151: Calling groups_plugins_play to load vars for managed_node1 18911 1727096296.96196: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096296.97886: done with get_vars() 18911 1727096296.97915: done getting variables 18911 1727096296.98014: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Monday 23 September 2024 08:58:16 -0400 (0:00:00.065) 0:00:16.094 ****** 18911 1727096296.98043: entering _queue_task() for managed_node1/service 18911 1727096296.98044: Creating lock for service 18911 1727096296.98388: worker is 1 (out of 1 available) 18911 1727096296.98400: exiting _queue_task() for managed_node1/service 18911 1727096296.98412: done queuing things up, now waiting for results queue to drain 18911 1727096296.98413: waiting for pending results... 
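[Editor's note: the task queued above (task path `main.yml:109`) loads the `service` action plugin, and the conditional evaluated further down combines the wireless and team flags. A minimal sketch under those observations; the service name and `state:` are assumptions.]

```yaml
# Sketch only: task name and when-expression match this log;
# the service arguments are hypothetical.
- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager   # assumed service unit
    state: restarted       # assumed
  when: __network_wireless_connections_defined or __network_team_connections_defined
```

[Neither flag is set in this run, so the log below reports the conditional as False and the restart is skipped.]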
18911 1727096296.98790: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 18911 1727096296.99131: in run() - task 0afff68d-5257-09a7-aae1-000000000024 18911 1727096296.99148: variable 'ansible_search_path' from source: unknown 18911 1727096296.99152: variable 'ansible_search_path' from source: unknown 18911 1727096296.99196: calling self._execute() 18911 1727096296.99302: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096296.99308: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096296.99320: variable 'omit' from source: magic vars 18911 1727096296.99736: variable 'ansible_distribution_major_version' from source: facts 18911 1727096296.99761: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096297.00109: variable '__network_wireless_connections_defined' from source: role '' defaults 18911 1727096297.00308: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18911 1727096297.02607: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18911 1727096297.02873: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18911 1727096297.02876: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18911 1727096297.02878: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18911 1727096297.02880: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18911 1727096297.02883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 18911 1727096297.02904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096297.02936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096297.02983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096297.03019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096297.03157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096297.03188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096297.03245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096297.03438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096297.03442: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096297.03444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096297.03564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096297.03597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096297.03764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096297.03774: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096297.04082: variable 'network_connections' from source: play vars 18911 1727096297.04273: variable 'interface' from source: set_fact 18911 1727096297.04276: variable 'interface' from source: set_fact 18911 1727096297.04528: variable 'interface' from source: set_fact 18911 1727096297.04531: variable 'interface' from source: set_fact 18911 1727096297.04600: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18911 1727096297.04995: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18911 1727096297.05039: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18911 1727096297.05105: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18911 1727096297.05211: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18911 1727096297.05330: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18911 1727096297.05333: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18911 1727096297.05443: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096297.05477: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18911 1727096297.05645: variable '__network_team_connections_defined' from source: role '' defaults 18911 1727096297.06605: variable 'network_connections' from source: play vars 18911 1727096297.06608: variable 'interface' from source: set_fact 18911 1727096297.06772: variable 'interface' from source: set_fact 18911 1727096297.06775: variable 'interface' from source: set_fact 18911 1727096297.06777: variable 'interface' from source: set_fact 18911 1727096297.06799: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 18911 1727096297.06802: when evaluation is False, skipping this task 18911 1727096297.06805: _execute() done 18911 1727096297.06807: dumping result to json 18911 1727096297.06809: done dumping result, returning 18911 1727096297.06824: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart 
NetworkManager due to wireless or team interfaces [0afff68d-5257-09a7-aae1-000000000024] 18911 1727096297.06834: sending task result for task 0afff68d-5257-09a7-aae1-000000000024 18911 1727096297.07195: done sending task result for task 0afff68d-5257-09a7-aae1-000000000024 18911 1727096297.07198: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 18911 1727096297.07234: no more pending results, returning what we have 18911 1727096297.07238: results queue empty 18911 1727096297.07239: checking for any_errors_fatal 18911 1727096297.07245: done checking for any_errors_fatal 18911 1727096297.07246: checking for max_fail_percentage 18911 1727096297.07248: done checking for max_fail_percentage 18911 1727096297.07248: checking to see if all hosts have failed and the running result is not ok 18911 1727096297.07249: done checking to see if all hosts have failed 18911 1727096297.07250: getting the remaining hosts for this loop 18911 1727096297.07251: done getting the remaining hosts for this loop 18911 1727096297.07254: getting the next task for host managed_node1 18911 1727096297.07259: done getting next task for host managed_node1 18911 1727096297.07263: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 18911 1727096297.07265: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096297.07279: getting variables 18911 1727096297.07280: in VariableManager get_vars() 18911 1727096297.07316: Calling all_inventory to load vars for managed_node1 18911 1727096297.07319: Calling groups_inventory to load vars for managed_node1 18911 1727096297.07321: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096297.07331: Calling all_plugins_play to load vars for managed_node1 18911 1727096297.07333: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096297.07336: Calling groups_plugins_play to load vars for managed_node1 18911 1727096297.10206: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096297.13955: done with get_vars() 18911 1727096297.13982: done getting variables 18911 1727096297.14049: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Monday 23 September 2024 08:58:17 -0400 (0:00:00.160) 0:00:16.255 ****** 18911 1727096297.14079: entering _queue_task() for managed_node1/service 18911 1727096297.14830: worker is 1 (out of 1 available) 18911 1727096297.14841: exiting _queue_task() for managed_node1/service 18911 1727096297.14851: done queuing things up, now waiting for results queue to drain 18911 1727096297.14852: waiting for pending results... 
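The `skipping: [managed_node1]` result earlier in this trace (with `"false_condition": "__network_wireless_connections_defined or __network_team_connections_defined"`) follows from the `when:` conditional evaluating to False. A minimal sketch of that control flow, illustrative only and not Ansible's actual `TaskExecutor` internals, using the same result keys that appear in the log:

```python
# Hedged sketch: how a task executor can short-circuit a task whose
# conditional evaluated to False, returning a skip result like the one
# printed in the log ("changed", "skip_reason"). Not ansible-core code.
def run_task(conditional_result: bool) -> dict:
    if not conditional_result:
        # "when evaluation is False, skipping this task"
        return {
            "changed": False,
            "skip_reason": "Conditional result was False",
        }
    # Otherwise the handler would actually run the module.
    return {"changed": True}

wireless_defined = False  # __network_wireless_connections_defined
team_defined = False      # __network_team_connections_defined
print(run_task(wireless_defined or team_defined))
```

In the run above, neither wireless nor team connections are defined, so the restart task never reaches its handler and the worker immediately sends the skip result back to the results queue.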
18911 1727096297.15273: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 18911 1727096297.15542: in run() - task 0afff68d-5257-09a7-aae1-000000000025 18911 1727096297.15564: variable 'ansible_search_path' from source: unknown 18911 1727096297.15577: variable 'ansible_search_path' from source: unknown 18911 1727096297.15619: calling self._execute() 18911 1727096297.15871: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096297.15884: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096297.15900: variable 'omit' from source: magic vars 18911 1727096297.16755: variable 'ansible_distribution_major_version' from source: facts 18911 1727096297.16825: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096297.17153: variable 'network_provider' from source: set_fact 18911 1727096297.17199: variable 'network_state' from source: role '' defaults 18911 1727096297.17227: Evaluated conditional (network_provider == "nm" or network_state != {}): True 18911 1727096297.17259: variable 'omit' from source: magic vars 18911 1727096297.17329: variable 'omit' from source: magic vars 18911 1727096297.17423: variable 'network_service_name' from source: role '' defaults 18911 1727096297.17875: variable 'network_service_name' from source: role '' defaults 18911 1727096297.18314: variable '__network_provider_setup' from source: role '' defaults 18911 1727096297.18318: variable '__network_service_name_default_nm' from source: role '' defaults 18911 1727096297.18421: variable '__network_service_name_default_nm' from source: role '' defaults 18911 1727096297.18495: variable '__network_packages_default_nm' from source: role '' defaults 18911 1727096297.18678: variable '__network_packages_default_nm' from source: role '' defaults 18911 1727096297.19275: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 18911 1727096297.23340: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18911 1727096297.23435: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18911 1727096297.23486: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18911 1727096297.23527: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18911 1727096297.23574: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18911 1727096297.23671: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096297.23706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096297.23736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096297.23803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096297.23853: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096297.23915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 18911 1727096297.23950: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096297.23989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096297.24054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096297.24085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096297.24346: variable '__network_packages_default_gobject_packages' from source: role '' defaults 18911 1727096297.24490: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096297.24524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096297.24555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096297.24607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096297.24632: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096297.24740: variable 'ansible_python' from source: facts 18911 1727096297.24774: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 18911 1727096297.24887: variable '__network_wpa_supplicant_required' from source: role '' defaults 18911 1727096297.24983: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 18911 1727096297.25133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096297.25174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096297.25205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096297.25287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096297.25308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096297.25447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096297.25782: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096297.25785: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096297.25787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096297.25972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096297.26115: variable 'network_connections' from source: play vars 18911 1727096297.26129: variable 'interface' from source: set_fact 18911 1727096297.26209: variable 'interface' from source: set_fact 18911 1727096297.26341: variable 'interface' from source: set_fact 18911 1727096297.26547: variable 'interface' from source: set_fact 18911 1727096297.26648: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18911 1727096297.27199: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18911 1727096297.27250: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18911 1727096297.27299: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18911 1727096297.27386: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18911 1727096297.27852: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18911 1727096297.27855: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18911 1727096297.27857: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096297.27888: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18911 1727096297.27944: variable '__network_wireless_connections_defined' from source: role '' defaults 18911 1727096297.28700: variable 'network_connections' from source: play vars 18911 1727096297.28779: variable 'interface' from source: set_fact 18911 1727096297.29052: variable 'interface' from source: set_fact 18911 1727096297.29055: variable 'interface' from source: set_fact 18911 1727096297.29272: variable 'interface' from source: set_fact 18911 1727096297.29276: variable '__network_packages_default_wireless' from source: role '' defaults 18911 1727096297.29409: variable '__network_wireless_connections_defined' from source: role '' defaults 18911 1727096297.30113: variable 'network_connections' from source: play vars 18911 1727096297.30117: variable 'interface' from source: set_fact 18911 1727096297.30192: variable 'interface' from source: set_fact 18911 1727096297.30199: variable 'interface' from source: set_fact 18911 1727096297.30264: variable 'interface' from source: set_fact 18911 1727096297.30856: variable '__network_packages_default_team' from source: role '' defaults 18911 1727096297.31061: variable '__network_team_connections_defined' from source: role '' defaults 18911 1727096297.31604: variable 
'network_connections' from source: play vars 18911 1727096297.31610: variable 'interface' from source: set_fact 18911 1727096297.31688: variable 'interface' from source: set_fact 18911 1727096297.31695: variable 'interface' from source: set_fact 18911 1727096297.32037: variable 'interface' from source: set_fact 18911 1727096297.32170: variable '__network_service_name_default_initscripts' from source: role '' defaults 18911 1727096297.32229: variable '__network_service_name_default_initscripts' from source: role '' defaults 18911 1727096297.32236: variable '__network_packages_default_initscripts' from source: role '' defaults 18911 1727096297.32465: variable '__network_packages_default_initscripts' from source: role '' defaults 18911 1727096297.32862: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 18911 1727096297.34278: variable 'network_connections' from source: play vars 18911 1727096297.34282: variable 'interface' from source: set_fact 18911 1727096297.34372: variable 'interface' from source: set_fact 18911 1727096297.34381: variable 'interface' from source: set_fact 18911 1727096297.34591: variable 'interface' from source: set_fact 18911 1727096297.34602: variable 'ansible_distribution' from source: facts 18911 1727096297.34605: variable '__network_rh_distros' from source: role '' defaults 18911 1727096297.34613: variable 'ansible_distribution_major_version' from source: facts 18911 1727096297.34638: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 18911 1727096297.35372: variable 'ansible_distribution' from source: facts 18911 1727096297.35381: variable '__network_rh_distros' from source: role '' defaults 18911 1727096297.35575: variable 'ansible_distribution_major_version' from source: facts 18911 1727096297.35578: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 18911 1727096297.35991: variable 'ansible_distribution' from source: 
facts 18911 1727096297.35995: variable '__network_rh_distros' from source: role '' defaults 18911 1727096297.36000: variable 'ansible_distribution_major_version' from source: facts 18911 1727096297.36176: variable 'network_provider' from source: set_fact 18911 1727096297.36184: variable 'omit' from source: magic vars 18911 1727096297.36215: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18911 1727096297.36363: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18911 1727096297.36380: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18911 1727096297.36398: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096297.36409: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096297.36436: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18911 1727096297.36440: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096297.36442: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096297.36757: Set connection var ansible_shell_executable to /bin/sh 18911 1727096297.36762: Set connection var ansible_timeout to 10 18911 1727096297.36771: Set connection var ansible_shell_type to sh 18911 1727096297.36780: Set connection var ansible_module_compression to ZIP_DEFLATED 18911 1727096297.36785: Set connection var ansible_pipelining to False 18911 1727096297.36985: Set connection var ansible_connection to ssh 18911 1727096297.37008: variable 'ansible_shell_executable' from source: unknown 18911 1727096297.37011: variable 'ansible_connection' from source: unknown 18911 1727096297.37015: variable 'ansible_module_compression' from source: unknown 18911 1727096297.37017: 
variable 'ansible_shell_type' from source: unknown 18911 1727096297.37019: variable 'ansible_shell_executable' from source: unknown 18911 1727096297.37021: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096297.37029: variable 'ansible_pipelining' from source: unknown 18911 1727096297.37031: variable 'ansible_timeout' from source: unknown 18911 1727096297.37033: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096297.37408: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18911 1727096297.37419: variable 'omit' from source: magic vars 18911 1727096297.37422: starting attempt loop 18911 1727096297.37425: running the handler 18911 1727096297.37633: variable 'ansible_facts' from source: unknown 18911 1727096297.39576: _low_level_execute_command(): starting 18911 1727096297.39580: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18911 1727096297.40659: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 18911 1727096297.40663: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096297.40700: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096297.40714: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096297.40731: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096297.40837: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096297.42676: stdout chunk (state=3): >>>/root <<< 18911 1727096297.42726: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096297.42732: stdout chunk (state=3): >>><<< 18911 1727096297.42740: stderr chunk (state=3): >>><<< 18911 1727096297.42846: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096297.42859: _low_level_execute_command(): starting 18911 1727096297.42871: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096297.4284637-19720-224973154971126 `" && echo ansible-tmp-1727096297.4284637-19720-224973154971126="` echo /root/.ansible/tmp/ansible-tmp-1727096297.4284637-19720-224973154971126 `" ) && sleep 0' 18911 1727096297.44319: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096297.44384: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096297.44547: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096297.44554: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096297.44617: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 18911 1727096297.47374: stdout chunk (state=3): >>>ansible-tmp-1727096297.4284637-19720-224973154971126=/root/.ansible/tmp/ansible-tmp-1727096297.4284637-19720-224973154971126 <<< 18911 1727096297.47378: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096297.47380: stdout chunk (state=3): >>><<< 18911 1727096297.47383: stderr chunk (state=3): >>><<< 18911 1727096297.47394: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096297.4284637-19720-224973154971126=/root/.ansible/tmp/ansible-tmp-1727096297.4284637-19720-224973154971126 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096297.47433: variable 'ansible_module_compression' from source: unknown 18911 1727096297.47493: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 18911 1727096297.47497: ANSIBALLZ: Acquiring lock 
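The repeated `auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23'` and `mux_client_request_session` stderr lines come from OpenSSH connection multiplexing, which Ansible enables so that each `_low_level_execute_command()` reuses one authenticated SSH session instead of renegotiating. An illustrative `ssh_config` fragment showing the options involved (an assumption about the effective settings, which Ansible typically passes as `-o` arguments rather than via a config file):

```text
Host *
    # Keep one master connection per destination; subsequent commands
    # attach to it ("mux_client_hello_exchange" in the log above).
    ControlMaster auto
    # Socket path for the master; Ansible keeps these under ~/.ansible/cp/
    ControlPath ~/.ansible/cp/%C
    # Keep the master alive briefly after the last session closes.
    ControlPersist 60s
```

This is why the echo, mkdir, sftp, and chmod steps in this trace each complete in tens of milliseconds: only the first connection pays the handshake cost.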
18911 1727096297.47499: ANSIBALLZ: Lock acquired: 140481135532592 18911 1727096297.47502: ANSIBALLZ: Creating module 18911 1727096297.89634: ANSIBALLZ: Writing module into payload 18911 1727096297.89773: ANSIBALLZ: Writing module 18911 1727096297.89886: ANSIBALLZ: Renaming module 18911 1727096297.89895: ANSIBALLZ: Done creating module 18911 1727096297.89956: variable 'ansible_facts' from source: unknown 18911 1727096297.90307: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096297.4284637-19720-224973154971126/AnsiballZ_systemd.py 18911 1727096297.90577: Sending initial data 18911 1727096297.90581: Sent initial data (156 bytes) 18911 1727096297.92031: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096297.92214: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096297.92218: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096297.92220: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096297.92374: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096297.93938: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18911 1727096297.94008: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18911 1727096297.94098: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18911d7od04qi/tmpcr9kgmi2 /root/.ansible/tmp/ansible-tmp-1727096297.4284637-19720-224973154971126/AnsiballZ_systemd.py <<< 18911 1727096297.94102: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096297.4284637-19720-224973154971126/AnsiballZ_systemd.py" <<< 18911 1727096297.94198: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18911d7od04qi/tmpcr9kgmi2" to remote "/root/.ansible/tmp/ansible-tmp-1727096297.4284637-19720-224973154971126/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096297.4284637-19720-224973154971126/AnsiballZ_systemd.py" <<< 18911 1727096297.97563: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096297.97631: stderr chunk (state=3): >>><<< 18911 1727096297.97645: stdout chunk (state=3): 
>>><<< 18911 1727096297.97671: done transferring module to remote 18911 1727096297.97681: _low_level_execute_command(): starting 18911 1727096297.97772: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096297.4284637-19720-224973154971126/ /root/.ansible/tmp/ansible-tmp-1727096297.4284637-19720-224973154971126/AnsiballZ_systemd.py && sleep 0' 18911 1727096297.98317: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096297.98326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096297.98381: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096297.98417: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096297.98432: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096297.98448: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096297.98548: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096298.00521: stderr chunk (state=3): >>>debug2: Received exit status from master 
0 <<< 18911 1727096298.00544: stderr chunk (state=3): >>><<< 18911 1727096298.00553: stdout chunk (state=3): >>><<< 18911 1727096298.00627: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096298.00631: _low_level_execute_command(): starting 18911 1727096298.00634: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096297.4284637-19720-224973154971126/AnsiballZ_systemd.py && sleep 0' 18911 1727096298.01973: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096298.02021: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096298.02131: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096298.32025: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": 
"1", "ExecMainStartTimestamp": "Mon 2024-09-23 08:51:00 EDT", "ExecMainStartTimestampMonotonic": "14125756", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Mon 2024-09-23 08:51:00 EDT", "ExecMainHandoffTimestampMonotonic": "14143412", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10645504", "MemoryPeak": "14716928", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3317911552", "EffectiveMemoryMax": "3702857728", "EffectiveMemoryHigh": "3702857728", "CPUUsageNSec": "862570000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", 
"CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpRe<<< 18911 1727096298.32033: stdout chunk (state=3): >>>ceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", 
"LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target multi-user.target cloud-init.service NetworkManager-wait-online.service network.target", "After": "system.slice dbus-broker.service systemd-journald.socket sysinit.target network-pre.target dbus.socket cloud-init-local.service basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:55:06 EDT", "StateChangeTimestampMonotonic": "260104767", "InactiveExitTimestamp": "Mon 2024-09-23 08:51:00 EDT", 
"InactiveExitTimestampMonotonic": "14126240", "ActiveEnterTimestamp": "Mon 2024-09-23 08:51:01 EDT", "ActiveEnterTimestampMonotonic": "14391210", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:51:00 EDT", "ConditionTimestampMonotonic": "14124859", "AssertTimestamp": "Mon 2024-09-23 08:51:00 EDT", "AssertTimestampMonotonic": "14124861", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "96e31adf3b0143aea7f2b03db689d56d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 18911 1727096298.34095: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
<<< 18911 1727096298.34099: stdout chunk (state=3): >>><<< 18911 1727096298.34101: stderr chunk (state=3): >>><<< 18911 1727096298.34104: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Mon 2024-09-23 08:51:00 EDT", "ExecMainStartTimestampMonotonic": "14125756", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Mon 2024-09-23 08:51:00 EDT", "ExecMainHandoffTimestampMonotonic": "14143412", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10645504", "MemoryPeak": "14716928", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3317911552", "EffectiveMemoryMax": "3702857728", "EffectiveMemoryHigh": "3702857728", "CPUUsageNSec": "862570000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target multi-user.target cloud-init.service NetworkManager-wait-online.service network.target", "After": "system.slice dbus-broker.service systemd-journald.socket sysinit.target network-pre.target dbus.socket cloud-init-local.service basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:55:06 EDT", "StateChangeTimestampMonotonic": "260104767", "InactiveExitTimestamp": "Mon 2024-09-23 08:51:00 EDT", "InactiveExitTimestampMonotonic": "14126240", "ActiveEnterTimestamp": "Mon 2024-09-23 08:51:01 EDT", "ActiveEnterTimestampMonotonic": "14391210", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:51:00 EDT", "ConditionTimestampMonotonic": "14124859", "AssertTimestamp": "Mon 2024-09-23 08:51:00 EDT", "AssertTimestampMonotonic": "14124861", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "96e31adf3b0143aea7f2b03db689d56d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
18911 1727096298.34391: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096297.4284637-19720-224973154971126/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18911 1727096298.34395: _low_level_execute_command(): starting 18911 1727096298.34398: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096297.4284637-19720-224973154971126/ > /dev/null 2>&1 && sleep 0' 18911 1727096298.35441: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096298.35454: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096298.35499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096298.35546: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096298.35639: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096298.35759: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096298.35790: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096298.36048: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096298.37980: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096298.37993: stdout chunk (state=3): >>><<< 18911 1727096298.38005: stderr chunk (state=3): >>><<< 18911 1727096298.38045: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096298.38058: handler run complete 18911 1727096298.38398: attempt loop complete, returning result 18911 1727096298.38401: _execute() done 18911 1727096298.38403: dumping result to json 18911 1727096298.38405: done dumping result, returning 18911 1727096298.38490: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0afff68d-5257-09a7-aae1-000000000025] 18911 1727096298.38493: sending task result for task 0afff68d-5257-09a7-aae1-000000000025 ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18911 1727096298.39206: no more pending results, returning what we have 18911 1727096298.39210: results queue empty 18911 1727096298.39211: checking for any_errors_fatal 18911 1727096298.39219: done checking for any_errors_fatal 18911 1727096298.39229: checking for max_fail_percentage 18911 1727096298.39231: done checking for max_fail_percentage 18911 1727096298.39232: checking to see if all hosts have failed and the running result is not ok 18911 1727096298.39233: done checking to see if all hosts have failed 18911 1727096298.39234: getting the remaining hosts for this loop 18911 1727096298.39235: done getting the remaining hosts for this loop 18911 1727096298.39241: getting the next task for host managed_node1 18911 1727096298.39248: done getting next task for host managed_node1 18911 1727096298.39252: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 18911 1727096298.39254: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 18911 1727096298.39276: done sending task result for task 0afff68d-5257-09a7-aae1-000000000025 18911 1727096298.39280: WORKER PROCESS EXITING 18911 1727096298.39287: getting variables 18911 1727096298.39290: in VariableManager get_vars() 18911 1727096298.39324: Calling all_inventory to load vars for managed_node1 18911 1727096298.39327: Calling groups_inventory to load vars for managed_node1 18911 1727096298.39329: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096298.39400: Calling all_plugins_play to load vars for managed_node1 18911 1727096298.39403: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096298.39407: Calling groups_plugins_play to load vars for managed_node1 18911 1727096298.42932: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096298.48756: done with get_vars() 18911 1727096298.48802: done getting variables 18911 1727096298.49238: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Monday 23 September 2024 08:58:18 -0400 (0:00:01.354) 0:00:17.609 ****** 18911 1727096298.49496: entering _queue_task() for managed_node1/service 18911 1727096298.50775: worker is 1 (out of 1 available) 18911 1727096298.50793: exiting _queue_task() for managed_node1/service 18911 1727096298.50806: done queuing things up, now waiting for results queue to drain 18911 1727096298.50808: waiting for pending results... 
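The `_execute_module(ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, ...})` call logged above corresponds to a service-management task in the role. A minimal sketch of what that task presumably looks like — the module name and arguments are taken from the logged args, but the exact YAML in the role's `tasks/main.yml` may differ:

```yaml
# Sketch only: reconstructed from the logged module arguments,
# not the role's verbatim source.
- name: Enable and start NetworkManager
  ansible.builtin.systemd:
    name: NetworkManager    # service name from the logged args
    state: started          # matches 'state': 'started'
    enabled: true           # matches 'enabled': True
  no_log: true              # the result above is censored, implying no_log was set
```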
18911 1727096298.51452: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 18911 1727096298.51674: in run() - task 0afff68d-5257-09a7-aae1-000000000026 18911 1727096298.51797: variable 'ansible_search_path' from source: unknown 18911 1727096298.51804: variable 'ansible_search_path' from source: unknown 18911 1727096298.51904: calling self._execute() 18911 1727096298.52121: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096298.52126: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096298.52229: variable 'omit' from source: magic vars 18911 1727096298.53275: variable 'ansible_distribution_major_version' from source: facts 18911 1727096298.53279: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096298.53436: variable 'network_provider' from source: set_fact 18911 1727096298.53448: Evaluated conditional (network_provider == "nm"): True 18911 1727096298.53591: variable '__network_wpa_supplicant_required' from source: role '' defaults 18911 1727096298.53728: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 18911 1727096298.54155: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18911 1727096298.58300: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18911 1727096298.58386: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18911 1727096298.58431: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18911 1727096298.58489: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18911 1727096298.58522: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18911 1727096298.58621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096298.58666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096298.58706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096298.58752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096298.58792: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096298.58845: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096298.58884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096298.58923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096298.58971: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096298.59004: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096298.59051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096298.59100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096298.59124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096298.59209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096298.59219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096298.59356: variable 'network_connections' from source: play vars 18911 1727096298.59383: variable 'interface' from source: set_fact 18911 1727096298.59479: variable 'interface' from source: set_fact 18911 1727096298.59492: variable 'interface' from source: set_fact 18911 1727096298.59564: variable 'interface' from source: set_fact 18911 1727096298.59650: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18911 1727096298.59833: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18911 1727096298.59977: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18911 1727096298.59980: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18911 1727096298.59983: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18911 1727096298.59990: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18911 1727096298.60018: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18911 1727096298.60049: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096298.60096: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18911 1727096298.60150: variable '__network_wireless_connections_defined' from source: role '' defaults 18911 1727096298.60396: variable 'network_connections' from source: play vars 18911 1727096298.60414: variable 'interface' from source: set_fact 18911 1727096298.60521: variable 'interface' from source: set_fact 18911 1727096298.60524: variable 'interface' from source: set_fact 18911 1727096298.60558: variable 'interface' from source: set_fact 18911 1727096298.60740: Evaluated conditional (__network_wpa_supplicant_required): False 18911 1727096298.60743: when evaluation is False, skipping this task 18911 1727096298.60746: _execute() done 18911 1727096298.60756: dumping result 
to json 18911 1727096298.60759: done dumping result, returning 18911 1727096298.60764: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0afff68d-5257-09a7-aae1-000000000026] 18911 1727096298.60768: sending task result for task 0afff68d-5257-09a7-aae1-000000000026 18911 1727096298.60834: done sending task result for task 0afff68d-5257-09a7-aae1-000000000026 skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 18911 1727096298.60888: no more pending results, returning what we have 18911 1727096298.60892: results queue empty 18911 1727096298.60893: checking for any_errors_fatal 18911 1727096298.60910: done checking for any_errors_fatal 18911 1727096298.60911: checking for max_fail_percentage 18911 1727096298.60912: done checking for max_fail_percentage 18911 1727096298.60913: checking to see if all hosts have failed and the running result is not ok 18911 1727096298.60914: done checking to see if all hosts have failed 18911 1727096298.60915: getting the remaining hosts for this loop 18911 1727096298.60916: done getting the remaining hosts for this loop 18911 1727096298.60919: getting the next task for host managed_node1 18911 1727096298.60926: done getting next task for host managed_node1 18911 1727096298.60930: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 18911 1727096298.60931: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096298.60945: getting variables 18911 1727096298.60946: in VariableManager get_vars() 18911 1727096298.60990: Calling all_inventory to load vars for managed_node1 18911 1727096298.60992: Calling groups_inventory to load vars for managed_node1 18911 1727096298.60994: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096298.61005: Calling all_plugins_play to load vars for managed_node1 18911 1727096298.61008: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096298.61010: Calling groups_plugins_play to load vars for managed_node1 18911 1727096298.61781: WORKER PROCESS EXITING 18911 1727096298.64006: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096298.66085: done with get_vars() 18911 1727096298.66120: done getting variables 18911 1727096298.66255: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Monday 23 September 2024 08:58:18 -0400 (0:00:00.167) 0:00:17.777 ****** 18911 1727096298.66287: entering _queue_task() for managed_node1/service 18911 1727096298.67006: worker is 1 (out of 1 available) 18911 1727096298.67023: exiting _queue_task() for managed_node1/service 18911 1727096298.67041: done queuing things up, now waiting for results queue to drain 18911 1727096298.67043: waiting for pending results... 
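The `skipping: [managed_node1]` result above reports `"false_condition": "__network_wpa_supplicant_required"`, meaning the task was gated by a `when:` clause whose conditions the log shows being evaluated one by one (`network_provider == "nm"` → True, `__network_wpa_supplicant_required` → False). A hedged sketch of that pattern — the variable names come from the log, while the surrounding YAML body is an assumption:

```yaml
# Sketch only: illustrates the conditional visible in the skip result above.
- name: Enable and start wpa_supplicant
  ansible.builtin.service:
    name: wpa_supplicant
    state: started
    enabled: true
  when:
    - network_provider == "nm"            # evaluated True in the log
    - __network_wpa_supplicant_required   # evaluated False, so the task is skipped
```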
18911 1727096298.67634: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 18911 1727096298.67810: in run() - task 0afff68d-5257-09a7-aae1-000000000027 18911 1727096298.67834: variable 'ansible_search_path' from source: unknown 18911 1727096298.67842: variable 'ansible_search_path' from source: unknown 18911 1727096298.67907: calling self._execute() 18911 1727096298.68185: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096298.68218: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096298.68265: variable 'omit' from source: magic vars 18911 1727096298.68774: variable 'ansible_distribution_major_version' from source: facts 18911 1727096298.68889: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096298.68917: variable 'network_provider' from source: set_fact 18911 1727096298.68928: Evaluated conditional (network_provider == "initscripts"): False 18911 1727096298.68935: when evaluation is False, skipping this task 18911 1727096298.68942: _execute() done 18911 1727096298.68950: dumping result to json 18911 1727096298.68957: done dumping result, returning 18911 1727096298.68975: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0afff68d-5257-09a7-aae1-000000000027] 18911 1727096298.68994: sending task result for task 0afff68d-5257-09a7-aae1-000000000027 18911 1727096298.69217: done sending task result for task 0afff68d-5257-09a7-aae1-000000000027 18911 1727096298.69221: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18911 1727096298.69269: no more pending results, returning what we have 18911 1727096298.69273: results queue empty 18911 1727096298.69274: checking for any_errors_fatal 18911 1727096298.69283: done checking for 
any_errors_fatal 18911 1727096298.69284: checking for max_fail_percentage 18911 1727096298.69285: done checking for max_fail_percentage 18911 1727096298.69286: checking to see if all hosts have failed and the running result is not ok 18911 1727096298.69287: done checking to see if all hosts have failed 18911 1727096298.69288: getting the remaining hosts for this loop 18911 1727096298.69289: done getting the remaining hosts for this loop 18911 1727096298.69292: getting the next task for host managed_node1 18911 1727096298.69299: done getting next task for host managed_node1 18911 1727096298.69303: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 18911 1727096298.69306: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096298.69321: getting variables 18911 1727096298.69323: in VariableManager get_vars() 18911 1727096298.69359: Calling all_inventory to load vars for managed_node1 18911 1727096298.69362: Calling groups_inventory to load vars for managed_node1 18911 1727096298.69364: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096298.69479: Calling all_plugins_play to load vars for managed_node1 18911 1727096298.69483: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096298.69486: Calling groups_plugins_play to load vars for managed_node1 18911 1727096298.72245: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096298.74094: done with get_vars() 18911 1727096298.74123: done getting variables 18911 1727096298.74194: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Monday 23 September 2024 08:58:18 -0400 (0:00:00.079) 0:00:17.856 ****** 18911 1727096298.74222: entering _queue_task() for managed_node1/copy 18911 1727096298.74791: worker is 1 (out of 1 available) 18911 1727096298.74807: exiting _queue_task() for managed_node1/copy 18911 1727096298.74820: done queuing things up, now waiting for results queue to drain 18911 1727096298.74822: waiting for pending results... 
18911 1727096298.75778: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 18911 1727096298.75820: in run() - task 0afff68d-5257-09a7-aae1-000000000028 18911 1727096298.75833: variable 'ansible_search_path' from source: unknown 18911 1727096298.75836: variable 'ansible_search_path' from source: unknown 18911 1727096298.76278: calling self._execute() 18911 1727096298.76405: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096298.76414: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096298.76417: variable 'omit' from source: magic vars 18911 1727096298.77450: variable 'ansible_distribution_major_version' from source: facts 18911 1727096298.77473: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096298.77616: variable 'network_provider' from source: set_fact 18911 1727096298.77720: Evaluated conditional (network_provider == "initscripts"): False 18911 1727096298.77724: when evaluation is False, skipping this task 18911 1727096298.77727: _execute() done 18911 1727096298.77730: dumping result to json 18911 1727096298.77732: done dumping result, returning 18911 1727096298.77736: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0afff68d-5257-09a7-aae1-000000000028] 18911 1727096298.77739: sending task result for task 0afff68d-5257-09a7-aae1-000000000028 18911 1727096298.77926: done sending task result for task 0afff68d-5257-09a7-aae1-000000000028 skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 18911 1727096298.77976: no more pending results, returning what we have 18911 1727096298.77980: results queue empty 18911 1727096298.77981: checking for any_errors_fatal 18911 1727096298.77988: done 
checking for any_errors_fatal 18911 1727096298.77989: checking for max_fail_percentage 18911 1727096298.77991: done checking for max_fail_percentage 18911 1727096298.77992: checking to see if all hosts have failed and the running result is not ok 18911 1727096298.77992: done checking to see if all hosts have failed 18911 1727096298.77993: getting the remaining hosts for this loop 18911 1727096298.77995: done getting the remaining hosts for this loop 18911 1727096298.77998: getting the next task for host managed_node1 18911 1727096298.78005: done getting next task for host managed_node1 18911 1727096298.78008: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 18911 1727096298.78011: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096298.78024: getting variables 18911 1727096298.78026: in VariableManager get_vars() 18911 1727096298.78060: Calling all_inventory to load vars for managed_node1 18911 1727096298.78065: Calling groups_inventory to load vars for managed_node1 18911 1727096298.78069: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096298.78082: Calling all_plugins_play to load vars for managed_node1 18911 1727096298.78085: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096298.78088: Calling groups_plugins_play to load vars for managed_node1 18911 1727096298.78681: WORKER PROCESS EXITING 18911 1727096298.79916: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096298.82157: done with get_vars() 18911 1727096298.82190: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Monday 23 September 2024 08:58:18 -0400 (0:00:00.080) 0:00:17.937 ****** 18911 1727096298.82286: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 18911 1727096298.82288: Creating lock for fedora.linux_system_roles.network_connections 18911 1727096298.82894: worker is 1 (out of 1 available) 18911 1727096298.82905: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 18911 1727096298.82916: done queuing things up, now waiting for results queue to drain 18911 1727096298.82918: waiting for pending results... 
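Both "Enable network service" and "Ensure initscripts network file dependency is present" are skipped above with the same false condition, `network_provider == "initscripts"`, since this run resolved `network_provider` to `nm`. The gating pattern, sketched — the conditional is taken from the skip result, while the module body is an assumption:

```yaml
# Sketch only: both initscripts-specific tasks are guarded the same way.
- name: Enable network service
  ansible.builtin.service:
    name: network
    enabled: true
  when: network_provider == "initscripts"   # False on this host, so skipped
```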
18911 1727096298.83488: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 18911 1727096298.83494: in run() - task 0afff68d-5257-09a7-aae1-000000000029 18911 1727096298.83497: variable 'ansible_search_path' from source: unknown 18911 1727096298.83500: variable 'ansible_search_path' from source: unknown 18911 1727096298.83502: calling self._execute() 18911 1727096298.83505: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096298.83508: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096298.83511: variable 'omit' from source: magic vars 18911 1727096298.83879: variable 'ansible_distribution_major_version' from source: facts 18911 1727096298.83890: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096298.83897: variable 'omit' from source: magic vars 18911 1727096298.83937: variable 'omit' from source: magic vars 18911 1727096298.84100: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18911 1727096298.86499: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18911 1727096298.86629: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18911 1727096298.86817: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18911 1727096298.86853: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18911 1727096298.86881: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18911 1727096298.87095: variable 'network_provider' from source: set_fact 18911 1727096298.87342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096298.87673: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096298.87676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096298.87679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096298.87682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096298.87756: variable 'omit' from source: magic vars 18911 1727096298.88173: variable 'omit' from source: magic vars 18911 1727096298.88255: variable 'network_connections' from source: play vars 18911 1727096298.88274: variable 'interface' from source: set_fact 18911 1727096298.88332: variable 'interface' from source: set_fact 18911 1727096298.88338: variable 'interface' from source: set_fact 18911 1727096298.88514: variable 'interface' from source: set_fact 18911 1727096298.88770: variable 'omit' from source: magic vars 18911 1727096298.88884: variable '__lsr_ansible_managed' from source: task vars 18911 1727096298.88951: variable '__lsr_ansible_managed' from source: task vars 18911 1727096298.90185: Loaded config def from plugin (lookup/template) 18911 1727096298.90190: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 18911 1727096298.90339: File lookup term: get_ansible_managed.j2 18911 
1727096298.90343: variable 'ansible_search_path' from source: unknown 18911 1727096298.90346: evaluation_path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 18911 1727096298.90361: search_path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 18911 1727096298.90440: variable 'ansible_search_path' from source: unknown 18911 1727096299.03191: variable 'ansible_managed' from source: unknown 18911 1727096299.03497: variable 'omit' from source: magic vars 18911 1727096299.03529: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18911 1727096299.03672: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18911 1727096299.03691: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18911 1727096299.03709: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096299.03718: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096299.03864: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18911 1727096299.03870: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096299.03873: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096299.04081: Set connection var ansible_shell_executable to /bin/sh 18911 1727096299.04089: Set connection var ansible_timeout to 10 18911 1727096299.04092: Set connection var ansible_shell_type to sh 18911 1727096299.04100: Set connection var ansible_module_compression to ZIP_DEFLATED 18911 1727096299.04105: Set connection var ansible_pipelining to False 18911 1727096299.04111: Set connection var ansible_connection to ssh 18911 1727096299.04134: variable 'ansible_shell_executable' from source: unknown 18911 1727096299.04137: variable 'ansible_connection' from source: unknown 18911 1727096299.04140: variable 'ansible_module_compression' from source: unknown 18911 1727096299.04142: variable 'ansible_shell_type' from source: unknown 18911 1727096299.04145: variable 'ansible_shell_executable' from source: unknown 18911 1727096299.04147: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096299.04151: variable 'ansible_pipelining' from source: unknown 18911 1727096299.04154: variable 'ansible_timeout' from source: unknown 18911 1727096299.04183: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096299.04475: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18911 1727096299.04486: variable 'omit' from source: magic vars 18911 1727096299.04489: starting attempt loop 18911 1727096299.04491: running the handler 18911 1727096299.04494: _low_level_execute_command(): starting 18911 1727096299.04496: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18911 1727096299.05819: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096299.05885: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 18911 1727096299.05924: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096299.06033: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096299.06073: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096299.06076: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096299.06474: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096299.07982: stdout 
chunk (state=3): >>>/root <<< 18911 1727096299.08095: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096299.08121: stderr chunk (state=3): >>><<< 18911 1727096299.08236: stdout chunk (state=3): >>><<< 18911 1727096299.08241: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096299.08244: _low_level_execute_command(): starting 18911 1727096299.08247: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096299.0820968-19800-274592390846379 `" && echo ansible-tmp-1727096299.0820968-19800-274592390846379="` echo /root/.ansible/tmp/ansible-tmp-1727096299.0820968-19800-274592390846379 `" ) && sleep 0' 18911 1727096299.09675: stderr chunk (state=2): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096299.09678: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096299.09745: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096299.09810: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096299.11844: stdout chunk (state=3): >>>ansible-tmp-1727096299.0820968-19800-274592390846379=/root/.ansible/tmp/ansible-tmp-1727096299.0820968-19800-274592390846379 <<< 18911 1727096299.11984: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096299.12041: stdout chunk (state=3): >>><<< 18911 1727096299.12050: stderr chunk (state=3): >>><<< 18911 1727096299.12071: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096299.0820968-19800-274592390846379=/root/.ansible/tmp/ansible-tmp-1727096299.0820968-19800-274592390846379 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096299.12173: variable 'ansible_module_compression' from source: unknown 18911 1727096299.12388: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 18911 1727096299.12391: ANSIBALLZ: Acquiring lock 18911 1727096299.12393: ANSIBALLZ: Lock acquired: 140481135647184 18911 1727096299.12396: ANSIBALLZ: Creating module 18911 1727096299.54188: ANSIBALLZ: Writing module into payload 18911 1727096299.54518: ANSIBALLZ: Writing module 18911 1727096299.54522: ANSIBALLZ: Renaming module 18911 1727096299.54525: ANSIBALLZ: Done creating module 18911 1727096299.54627: variable 'ansible_facts' from source: unknown 18911 1727096299.54656: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096299.0820968-19800-274592390846379/AnsiballZ_network_connections.py 18911 1727096299.54885: Sending initial data 18911 1727096299.54889: Sent initial data (168 bytes) 18911 1727096299.55930: stderr 
chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096299.56175: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096299.56188: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096299.56198: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096299.56369: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096299.58012: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 
debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18911 1727096299.58100: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18911 1727096299.58170: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18911d7od04qi/tmpzfqdnixs /root/.ansible/tmp/ansible-tmp-1727096299.0820968-19800-274592390846379/AnsiballZ_network_connections.py <<< 18911 1727096299.58174: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096299.0820968-19800-274592390846379/AnsiballZ_network_connections.py" <<< 18911 1727096299.58254: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18911d7od04qi/tmpzfqdnixs" to remote "/root/.ansible/tmp/ansible-tmp-1727096299.0820968-19800-274592390846379/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096299.0820968-19800-274592390846379/AnsiballZ_network_connections.py" <<< 18911 1727096299.59974: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096299.60046: stderr chunk (state=3): >>><<< 18911 1727096299.60049: stdout chunk (state=3): >>><<< 18911 1727096299.60052: done transferring module to remote 18911 1727096299.60054: _low_level_execute_command(): starting 18911 1727096299.60064: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096299.0820968-19800-274592390846379/ /root/.ansible/tmp/ansible-tmp-1727096299.0820968-19800-274592390846379/AnsiballZ_network_connections.py && sleep 0' 18911 1727096299.61197: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096299.61302: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096299.61422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096299.61470: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096299.61613: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096299.63541: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096299.63548: stdout chunk (state=3): >>><<< 18911 1727096299.63556: stderr chunk (state=3): >>><<< 18911 1727096299.63672: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096299.63786: _low_level_execute_command(): starting 18911 1727096299.63789: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096299.0820968-19800-274592390846379/AnsiballZ_network_connections.py && sleep 0' 18911 1727096299.64390: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096299.64428: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/8e98a30b23' <<< 18911 1727096299.64446: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096299.64477: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096299.64613: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096300.12835: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, 310bfb25-4f10-4235-b65a-8e010daf3c53\n[004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, 310bfb25-4f10-4235-b65a-8e010daf3c53 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "interface_name": "lsr27", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"address": "192.0.2.1/24"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "interface_name": "lsr27", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"address": "192.0.2.1/24"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 18911 1727096300.15174: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096300.15178: stderr chunk (state=3): >>>Shared connection to 10.31.11.125 closed. 
<<< 18911 1727096300.15181: stdout chunk (state=3): >>><<< 18911 1727096300.15183: stderr chunk (state=3): >>><<< 18911 1727096300.15186: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, 310bfb25-4f10-4235-b65a-8e010daf3c53\n[004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, 310bfb25-4f10-4235-b65a-8e010daf3c53 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "interface_name": "lsr27", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"address": "192.0.2.1/24"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "interface_name": "lsr27", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"address": "192.0.2.1/24"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 18911 1727096300.15188: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'lsr27', 'interface_name': 'lsr27', 'state': 'up', 'type': 'ethernet', 'autoconnect': True, 'ip': {'address': '192.0.2.1/24'}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096299.0820968-19800-274592390846379/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18911 1727096300.15191: _low_level_execute_command(): starting 18911 1727096300.15193: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096299.0820968-19800-274592390846379/ > /dev/null 2>&1 && sleep 0' 18911 1727096300.15747: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096300.15751: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 18911 1727096300.15782: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096300.15785: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096300.15788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096300.15843: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096300.15847: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096300.15852: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096300.15923: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096300.17978: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096300.17982: stdout chunk (state=3): >>><<< 18911 1727096300.17984: stderr chunk (state=3): >>><<< 18911 1727096300.18000: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096300.18015: handler run complete 18911 1727096300.18122: attempt loop complete, returning result 18911 1727096300.18126: _execute() done 18911 1727096300.18128: dumping result to json 18911 1727096300.18130: done dumping result, returning 18911 1727096300.18132: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0afff68d-5257-09a7-aae1-000000000029] 18911 1727096300.18134: sending task result for task 0afff68d-5257-09a7-aae1-000000000029 changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": true, "interface_name": "lsr27", "ip": { "address": "192.0.2.1/24" }, "name": "lsr27", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, 310bfb25-4f10-4235-b65a-8e010daf3c53 [004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, 310bfb25-4f10-4235-b65a-8e010daf3c53 (not-active) 18911 
1727096300.18342: no more pending results, returning what we have 18911 1727096300.18346: results queue empty 18911 1727096300.18347: checking for any_errors_fatal 18911 1727096300.18355: done checking for any_errors_fatal 18911 1727096300.18356: checking for max_fail_percentage 18911 1727096300.18357: done checking for max_fail_percentage 18911 1727096300.18358: checking to see if all hosts have failed and the running result is not ok 18911 1727096300.18359: done checking to see if all hosts have failed 18911 1727096300.18360: getting the remaining hosts for this loop 18911 1727096300.18364: done getting the remaining hosts for this loop 18911 1727096300.18370: getting the next task for host managed_node1 18911 1727096300.18377: done getting next task for host managed_node1 18911 1727096300.18380: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 18911 1727096300.18382: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096300.18393: getting variables 18911 1727096300.18394: in VariableManager get_vars() 18911 1727096300.18430: Calling all_inventory to load vars for managed_node1 18911 1727096300.18432: Calling groups_inventory to load vars for managed_node1 18911 1727096300.18435: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096300.18446: Calling all_plugins_play to load vars for managed_node1 18911 1727096300.18448: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096300.18452: Calling groups_plugins_play to load vars for managed_node1 18911 1727096300.19391: done sending task result for task 0afff68d-5257-09a7-aae1-000000000029 18911 1727096300.19394: WORKER PROCESS EXITING 18911 1727096300.20570: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096300.22198: done with get_vars() 18911 1727096300.22228: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Monday 23 September 2024 08:58:20 -0400 (0:00:01.400) 0:00:19.337 ****** 18911 1727096300.22317: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 18911 1727096300.22319: Creating lock for fedora.linux_system_roles.network_state 18911 1727096300.22700: worker is 1 (out of 1 available) 18911 1727096300.22712: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 18911 1727096300.22724: done queuing things up, now waiting for results queue to drain 18911 1727096300.22725: waiting for pending results... 
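The task result rendered above is the JSON document the `fedora.linux_system_roles.network_connections` module printed on stdout. A minimal sketch of pulling out the fields Ansible reports, using the exact payload captured in this log (trimmed to the keys shown):

```python
import json

# JSON payload as captured from the module's stdout earlier in this log
# (abbreviated to the invocation keys visible above).
result_json = """
{"changed": true, "warnings": [],
 "invocation": {"module_args": {"provider": "nm",
   "connections": [{"name": "lsr27", "interface_name": "lsr27",
     "state": "up", "type": "ethernet", "autoconnect": true,
     "ip": {"address": "192.0.2.1/24"}}],
   "ignore_errors": false, "force_state_change": false}}}
"""

result = json.loads(result_json)
conn = result["invocation"]["module_args"]["connections"][0]
print(result["changed"])                    # whether the task reported a change
print(conn["name"], conn["ip"]["address"])  # profile name and its static address
```

This is how `changed: [managed_node1]` and the `module_args` block in the summary above are derived: Ansible deserializes the module's stdout and renders selected keys.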
18911 1727096300.23029: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 18911 1727096300.23148: in run() - task 0afff68d-5257-09a7-aae1-00000000002a 18911 1727096300.23176: variable 'ansible_search_path' from source: unknown 18911 1727096300.23184: variable 'ansible_search_path' from source: unknown 18911 1727096300.23230: calling self._execute() 18911 1727096300.23339: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096300.23352: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096300.23371: variable 'omit' from source: magic vars 18911 1727096300.23757: variable 'ansible_distribution_major_version' from source: facts 18911 1727096300.23778: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096300.23900: variable 'network_state' from source: role '' defaults 18911 1727096300.23917: Evaluated conditional (network_state != {}): False 18911 1727096300.23925: when evaluation is False, skipping this task 18911 1727096300.23933: _execute() done 18911 1727096300.23941: dumping result to json 18911 1727096300.23950: done dumping result, returning 18911 1727096300.23968: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0afff68d-5257-09a7-aae1-00000000002a] 18911 1727096300.23981: sending task result for task 0afff68d-5257-09a7-aae1-00000000002a 18911 1727096300.24396: done sending task result for task 0afff68d-5257-09a7-aae1-00000000002a 18911 1727096300.24400: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18911 1727096300.24452: no more pending results, returning what we have 18911 1727096300.24455: results queue empty 18911 1727096300.24456: checking for any_errors_fatal 18911 1727096300.24472: done checking for any_errors_fatal 
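The skip recorded above comes from the task's `when: network_state != {}` guard, with `network_state` taking its empty-mapping role default. A hypothetical sketch of that decision (the helper name and the defaults-file assumption are illustrative, not from the role source):

```python
# Assumption: mirrors the `network_state: {}` default the role ships in
# defaults/main.yml; an empty mapping means no declarative state was requested.
network_state = {}

def should_run_configure_state(network_state):
    """Mimic the task's `when: network_state != {}` conditional."""
    return network_state != {}

if not should_run_configure_state(network_state):
    # Matches the log: "when evaluation is False, skipping this task"
    print("skipping: Conditional result was False")
```

With the default in place the conditional evaluates False and the task is skipped without ever reaching the connection plugin, which is why no SSH traffic appears for it in the log.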
18911 1727096300.24473: checking for max_fail_percentage 18911 1727096300.24475: done checking for max_fail_percentage 18911 1727096300.24476: checking to see if all hosts have failed and the running result is not ok 18911 1727096300.24476: done checking to see if all hosts have failed 18911 1727096300.24477: getting the remaining hosts for this loop 18911 1727096300.24479: done getting the remaining hosts for this loop 18911 1727096300.24482: getting the next task for host managed_node1 18911 1727096300.24490: done getting next task for host managed_node1 18911 1727096300.24493: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 18911 1727096300.24496: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096300.24512: getting variables 18911 1727096300.24514: in VariableManager get_vars() 18911 1727096300.24553: Calling all_inventory to load vars for managed_node1 18911 1727096300.24556: Calling groups_inventory to load vars for managed_node1 18911 1727096300.24559: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096300.24775: Calling all_plugins_play to load vars for managed_node1 18911 1727096300.24778: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096300.24782: Calling groups_plugins_play to load vars for managed_node1 18911 1727096300.27324: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096300.29472: done with get_vars() 18911 1727096300.29503: done getting variables 18911 1727096300.29566: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Monday 23 September 2024 08:58:20 -0400 (0:00:00.072) 0:00:19.410 ****** 18911 1727096300.29598: entering _queue_task() for managed_node1/debug 18911 1727096300.29937: worker is 1 (out of 1 available) 18911 1727096300.29949: exiting _queue_task() for managed_node1/debug 18911 1727096300.29964: done queuing things up, now waiting for results queue to drain 18911 1727096300.29965: waiting for pending results... 
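The module STDERR lines shown earlier (`[003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, <uuid>`) follow a fixed layout. A small sketch of extracting the step number, state, profile name, and connection UUID with a regex; the pattern is inferred from this log's lines and is not a documented, stable interface:

```python
import re

# One status line as captured from the module's stderr above.
STDERR_LINE = (
    "[003] #0, state:up persistent_state:present, 'lsr27': "
    "add connection lsr27, 310bfb25-4f10-4235-b65a-8e010daf3c53"
)

# Pattern inferred from the log lines in this run; not a stable API.
pattern = re.compile(
    r"\[(?P<step>\d+)\] #(?P<index>\d+), "
    r"state:(?P<state>\S+) persistent_state:(?P<pstate>\S+), "
    r"'(?P<name>[^']+)': (?P<action>.+), (?P<uuid>[0-9a-f-]{36})"
)

m = pattern.match(STDERR_LINE)
print(m.group("step"), m.group("state"), m.group("uuid"))
```

Parsing these lines is only useful for post-hoc log analysis; the authoritative result is the JSON the module returns on stdout.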
18911 1727096300.30252: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 18911 1727096300.30379: in run() - task 0afff68d-5257-09a7-aae1-00000000002b 18911 1727096300.30404: variable 'ansible_search_path' from source: unknown 18911 1727096300.30412: variable 'ansible_search_path' from source: unknown 18911 1727096300.30452: calling self._execute() 18911 1727096300.30547: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096300.30557: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096300.30611: variable 'omit' from source: magic vars 18911 1727096300.30990: variable 'ansible_distribution_major_version' from source: facts 18911 1727096300.31008: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096300.31020: variable 'omit' from source: magic vars 18911 1727096300.31073: variable 'omit' from source: magic vars 18911 1727096300.31115: variable 'omit' from source: magic vars 18911 1727096300.31270: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18911 1727096300.31274: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18911 1727096300.31276: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18911 1727096300.31279: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096300.31281: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096300.31312: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18911 1727096300.31322: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096300.31329: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 18911 1727096300.31444: Set connection var ansible_shell_executable to /bin/sh 18911 1727096300.31456: Set connection var ansible_timeout to 10 18911 1727096300.31469: Set connection var ansible_shell_type to sh 18911 1727096300.31487: Set connection var ansible_module_compression to ZIP_DEFLATED 18911 1727096300.31497: Set connection var ansible_pipelining to False 18911 1727096300.31506: Set connection var ansible_connection to ssh 18911 1727096300.31533: variable 'ansible_shell_executable' from source: unknown 18911 1727096300.31541: variable 'ansible_connection' from source: unknown 18911 1727096300.31549: variable 'ansible_module_compression' from source: unknown 18911 1727096300.31556: variable 'ansible_shell_type' from source: unknown 18911 1727096300.31567: variable 'ansible_shell_executable' from source: unknown 18911 1727096300.31577: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096300.31587: variable 'ansible_pipelining' from source: unknown 18911 1727096300.31598: variable 'ansible_timeout' from source: unknown 18911 1727096300.31772: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096300.31776: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18911 1727096300.31779: variable 'omit' from source: magic vars 18911 1727096300.31781: starting attempt loop 18911 1727096300.31787: running the handler 18911 1727096300.31925: variable '__network_connections_result' from source: set_fact 18911 1727096300.31986: handler run complete 18911 1727096300.32013: attempt loop complete, returning result 18911 1727096300.32020: _execute() done 18911 1727096300.32027: dumping result to json 18911 1727096300.32033: 
done dumping result, returning 18911 1727096300.32046: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0afff68d-5257-09a7-aae1-00000000002b] 18911 1727096300.32054: sending task result for task 0afff68d-5257-09a7-aae1-00000000002b 18911 1727096300.32273: done sending task result for task 0afff68d-5257-09a7-aae1-00000000002b 18911 1727096300.32277: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "[003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, 310bfb25-4f10-4235-b65a-8e010daf3c53", "[004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, 310bfb25-4f10-4235-b65a-8e010daf3c53 (not-active)" ] } 18911 1727096300.32341: no more pending results, returning what we have 18911 1727096300.32344: results queue empty 18911 1727096300.32345: checking for any_errors_fatal 18911 1727096300.32354: done checking for any_errors_fatal 18911 1727096300.32355: checking for max_fail_percentage 18911 1727096300.32357: done checking for max_fail_percentage 18911 1727096300.32358: checking to see if all hosts have failed and the running result is not ok 18911 1727096300.32359: done checking to see if all hosts have failed 18911 1727096300.32359: getting the remaining hosts for this loop 18911 1727096300.32361: done getting the remaining hosts for this loop 18911 1727096300.32369: getting the next task for host managed_node1 18911 1727096300.32377: done getting next task for host managed_node1 18911 1727096300.32380: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 18911 1727096300.32383: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 18911 1727096300.32393: getting variables 18911 1727096300.32395: in VariableManager get_vars() 18911 1727096300.32432: Calling all_inventory to load vars for managed_node1 18911 1727096300.32435: Calling groups_inventory to load vars for managed_node1 18911 1727096300.32438: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096300.32449: Calling all_plugins_play to load vars for managed_node1 18911 1727096300.32453: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096300.32456: Calling groups_plugins_play to load vars for managed_node1 18911 1727096300.34153: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096300.35783: done with get_vars() 18911 1727096300.35809: done getting variables 18911 1727096300.35874: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Monday 23 September 2024 08:58:20 -0400 (0:00:00.063) 0:00:19.473 ****** 18911 1727096300.35904: entering _queue_task() for managed_node1/debug 18911 1727096300.36250: worker is 1 (out of 1 available) 18911 1727096300.36265: exiting _queue_task() for managed_node1/debug 18911 1727096300.36379: done queuing things up, now waiting for results queue to drain 18911 1727096300.36381: waiting for pending results... 
18911 1727096300.36561: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 18911 1727096300.36874: in run() - task 0afff68d-5257-09a7-aae1-00000000002c 18911 1727096300.36878: variable 'ansible_search_path' from source: unknown 18911 1727096300.36881: variable 'ansible_search_path' from source: unknown 18911 1727096300.36884: calling self._execute() 18911 1727096300.36887: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096300.36890: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096300.36893: variable 'omit' from source: magic vars 18911 1727096300.37254: variable 'ansible_distribution_major_version' from source: facts 18911 1727096300.37276: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096300.37287: variable 'omit' from source: magic vars 18911 1727096300.37323: variable 'omit' from source: magic vars 18911 1727096300.37370: variable 'omit' from source: magic vars 18911 1727096300.37414: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18911 1727096300.37457: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18911 1727096300.37487: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18911 1727096300.37508: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096300.37522: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096300.37559: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18911 1727096300.37572: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096300.37580: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 18911 1727096300.37687: Set connection var ansible_shell_executable to /bin/sh 18911 1727096300.37698: Set connection var ansible_timeout to 10 18911 1727096300.37705: Set connection var ansible_shell_type to sh 18911 1727096300.37716: Set connection var ansible_module_compression to ZIP_DEFLATED 18911 1727096300.37724: Set connection var ansible_pipelining to False 18911 1727096300.37732: Set connection var ansible_connection to ssh 18911 1727096300.37756: variable 'ansible_shell_executable' from source: unknown 18911 1727096300.37770: variable 'ansible_connection' from source: unknown 18911 1727096300.37778: variable 'ansible_module_compression' from source: unknown 18911 1727096300.37785: variable 'ansible_shell_type' from source: unknown 18911 1727096300.37791: variable 'ansible_shell_executable' from source: unknown 18911 1727096300.37797: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096300.37874: variable 'ansible_pipelining' from source: unknown 18911 1727096300.37877: variable 'ansible_timeout' from source: unknown 18911 1727096300.37879: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096300.37957: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18911 1727096300.37983: variable 'omit' from source: magic vars 18911 1727096300.37994: starting attempt loop 18911 1727096300.38001: running the handler 18911 1727096300.38051: variable '__network_connections_result' from source: set_fact 18911 1727096300.38140: variable '__network_connections_result' from source: set_fact 18911 1727096300.38272: handler run complete 18911 1727096300.38308: attempt loop complete, returning result 18911 1727096300.38316: 
_execute() done 18911 1727096300.38323: dumping result to json 18911 1727096300.38332: done dumping result, returning 18911 1727096300.38344: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0afff68d-5257-09a7-aae1-00000000002c] 18911 1727096300.38418: sending task result for task 0afff68d-5257-09a7-aae1-00000000002c 18911 1727096300.38493: done sending task result for task 0afff68d-5257-09a7-aae1-00000000002c 18911 1727096300.38496: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": true, "interface_name": "lsr27", "ip": { "address": "192.0.2.1/24" }, "name": "lsr27", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, 310bfb25-4f10-4235-b65a-8e010daf3c53\n[004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, 310bfb25-4f10-4235-b65a-8e010daf3c53 (not-active)\n", "stderr_lines": [ "[003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, 310bfb25-4f10-4235-b65a-8e010daf3c53", "[004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, 310bfb25-4f10-4235-b65a-8e010daf3c53 (not-active)" ] } } 18911 1727096300.38610: no more pending results, returning what we have 18911 1727096300.38614: results queue empty 18911 1727096300.38614: checking for any_errors_fatal 18911 1727096300.38622: done checking for any_errors_fatal 18911 1727096300.38623: checking for max_fail_percentage 18911 1727096300.38625: done checking for max_fail_percentage 18911 1727096300.38626: checking to see if all hosts have failed and the running result is not ok 18911 1727096300.38626: done 
checking to see if all hosts have failed 18911 1727096300.38627: getting the remaining hosts for this loop 18911 1727096300.38628: done getting the remaining hosts for this loop 18911 1727096300.38632: getting the next task for host managed_node1 18911 1727096300.38641: done getting next task for host managed_node1 18911 1727096300.38645: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 18911 1727096300.38647: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18911 1727096300.38658: getting variables 18911 1727096300.38660: in VariableManager get_vars() 18911 1727096300.38698: Calling all_inventory to load vars for managed_node1 18911 1727096300.38701: Calling groups_inventory to load vars for managed_node1 18911 1727096300.38703: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096300.38714: Calling all_plugins_play to load vars for managed_node1 18911 1727096300.38716: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096300.38719: Calling groups_plugins_play to load vars for managed_node1 18911 1727096300.40346: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096300.41982: done with get_vars() 18911 1727096300.42008: done getting variables 18911 1727096300.42069: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: 
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Monday 23 September 2024 08:58:20 -0400 (0:00:00.061) 0:00:19.535 ****** 18911 1727096300.42098: entering _queue_task() for managed_node1/debug 18911 1727096300.42412: worker is 1 (out of 1 available) 18911 1727096300.42426: exiting _queue_task() for managed_node1/debug 18911 1727096300.42438: done queuing things up, now waiting for results queue to drain 18911 1727096300.42439: waiting for pending results... 18911 1727096300.42796: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 18911 1727096300.42836: in run() - task 0afff68d-5257-09a7-aae1-00000000002d 18911 1727096300.42856: variable 'ansible_search_path' from source: unknown 18911 1727096300.42869: variable 'ansible_search_path' from source: unknown 18911 1727096300.42916: calling self._execute() 18911 1727096300.43073: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096300.43077: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096300.43080: variable 'omit' from source: magic vars 18911 1727096300.43423: variable 'ansible_distribution_major_version' from source: facts 18911 1727096300.43446: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096300.43583: variable 'network_state' from source: role '' defaults 18911 1727096300.43601: Evaluated conditional (network_state != {}): False 18911 1727096300.43609: when evaluation is False, skipping this task 18911 1727096300.43617: _execute() done 18911 1727096300.43624: dumping result to json 18911 1727096300.43631: done dumping result, returning 18911 1727096300.43647: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0afff68d-5257-09a7-aae1-00000000002d] 18911 1727096300.43661: sending task result for task 
0afff68d-5257-09a7-aae1-00000000002d 18911 1727096300.43931: done sending task result for task 0afff68d-5257-09a7-aae1-00000000002d 18911 1727096300.43934: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "network_state != {}" } 18911 1727096300.43982: no more pending results, returning what we have 18911 1727096300.43985: results queue empty 18911 1727096300.43986: checking for any_errors_fatal 18911 1727096300.43993: done checking for any_errors_fatal 18911 1727096300.43994: checking for max_fail_percentage 18911 1727096300.43996: done checking for max_fail_percentage 18911 1727096300.43997: checking to see if all hosts have failed and the running result is not ok 18911 1727096300.43997: done checking to see if all hosts have failed 18911 1727096300.43998: getting the remaining hosts for this loop 18911 1727096300.43999: done getting the remaining hosts for this loop 18911 1727096300.44002: getting the next task for host managed_node1 18911 1727096300.44008: done getting next task for host managed_node1 18911 1727096300.44011: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 18911 1727096300.44013: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096300.44028: getting variables 18911 1727096300.44030: in VariableManager get_vars() 18911 1727096300.44066: Calling all_inventory to load vars for managed_node1 18911 1727096300.44071: Calling groups_inventory to load vars for managed_node1 18911 1727096300.44074: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096300.44084: Calling all_plugins_play to load vars for managed_node1 18911 1727096300.44087: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096300.44090: Calling groups_plugins_play to load vars for managed_node1 18911 1727096300.45613: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096300.47178: done with get_vars() 18911 1727096300.47205: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Monday 23 September 2024 08:58:20 -0400 (0:00:00.052) 0:00:19.587 ****** 18911 1727096300.47310: entering _queue_task() for managed_node1/ping 18911 1727096300.47312: Creating lock for ping 18911 1727096300.47681: worker is 1 (out of 1 available) 18911 1727096300.47693: exiting _queue_task() for managed_node1/ping 18911 1727096300.47705: done queuing things up, now waiting for results queue to drain 18911 1727096300.47707: waiting for pending results... 
18911 1727096300.48088: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 18911 1727096300.48103: in run() - task 0afff68d-5257-09a7-aae1-00000000002e 18911 1727096300.48123: variable 'ansible_search_path' from source: unknown 18911 1727096300.48131: variable 'ansible_search_path' from source: unknown 18911 1727096300.48181: calling self._execute() 18911 1727096300.48288: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096300.48303: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096300.48320: variable 'omit' from source: magic vars 18911 1727096300.48714: variable 'ansible_distribution_major_version' from source: facts 18911 1727096300.48737: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096300.48749: variable 'omit' from source: magic vars 18911 1727096300.48795: variable 'omit' from source: magic vars 18911 1727096300.48841: variable 'omit' from source: magic vars 18911 1727096300.48892: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18911 1727096300.48936: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18911 1727096300.48972: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18911 1727096300.49058: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096300.49061: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096300.49066: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18911 1727096300.49070: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096300.49073: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 18911 1727096300.49173: Set connection var ansible_shell_executable to /bin/sh 18911 1727096300.49186: Set connection var ansible_timeout to 10 18911 1727096300.49194: Set connection var ansible_shell_type to sh 18911 1727096300.49204: Set connection var ansible_module_compression to ZIP_DEFLATED 18911 1727096300.49212: Set connection var ansible_pipelining to False 18911 1727096300.49221: Set connection var ansible_connection to ssh 18911 1727096300.49247: variable 'ansible_shell_executable' from source: unknown 18911 1727096300.49254: variable 'ansible_connection' from source: unknown 18911 1727096300.49261: variable 'ansible_module_compression' from source: unknown 18911 1727096300.49275: variable 'ansible_shell_type' from source: unknown 18911 1727096300.49381: variable 'ansible_shell_executable' from source: unknown 18911 1727096300.49384: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096300.49386: variable 'ansible_pipelining' from source: unknown 18911 1727096300.49388: variable 'ansible_timeout' from source: unknown 18911 1727096300.49390: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096300.49511: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18911 1727096300.49528: variable 'omit' from source: magic vars 18911 1727096300.49539: starting attempt loop 18911 1727096300.49547: running the handler 18911 1727096300.49571: _low_level_execute_command(): starting 18911 1727096300.49585: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18911 1727096300.50155: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 
1727096300.50167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096300.50197: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found <<< 18911 1727096300.50201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096300.50251: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096300.50255: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096300.50259: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096300.50333: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096300.52068: stdout chunk (state=3): >>>/root <<< 18911 1727096300.52225: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096300.52229: stdout chunk (state=3): >>><<< 18911 1727096300.52231: stderr chunk (state=3): >>><<< 18911 1727096300.52258: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 
originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096300.52293: _low_level_execute_command(): starting 18911 1727096300.52297: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096300.522652-19881-233453067335836 `" && echo ansible-tmp-1727096300.522652-19881-233453067335836="` echo /root/.ansible/tmp/ansible-tmp-1727096300.522652-19881-233453067335836 `" ) && sleep 0' 18911 1727096300.52734: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096300.52741: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096300.52767: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096300.52779: stderr chunk (state=3): >>>debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096300.52782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096300.52825: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096300.52829: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096300.52902: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096300.54930: stdout chunk (state=3): >>>ansible-tmp-1727096300.522652-19881-233453067335836=/root/.ansible/tmp/ansible-tmp-1727096300.522652-19881-233453067335836 <<< 18911 1727096300.55085: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096300.55089: stdout chunk (state=3): >>><<< 18911 1727096300.55091: stderr chunk (state=3): >>><<< 18911 1727096300.55109: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096300.522652-19881-233453067335836=/root/.ansible/tmp/ansible-tmp-1727096300.522652-19881-233453067335836 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096300.55277: variable 'ansible_module_compression' from source: unknown 18911 1727096300.55280: ANSIBALLZ: Using lock for ping 18911 1727096300.55282: ANSIBALLZ: Acquiring lock 18911 1727096300.55285: ANSIBALLZ: Lock acquired: 140481135649776 18911 1727096300.55287: ANSIBALLZ: Creating module 18911 1727096300.68776: ANSIBALLZ: Writing module into payload 18911 1727096300.68842: ANSIBALLZ: Writing module 18911 1727096300.68864: ANSIBALLZ: Renaming module 18911 1727096300.68876: ANSIBALLZ: Done creating module 18911 1727096300.68893: variable 'ansible_facts' from source: unknown 18911 1727096300.68980: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096300.522652-19881-233453067335836/AnsiballZ_ping.py 18911 1727096300.69248: Sending initial data 18911 1727096300.69251: Sent initial data (152 bytes) 18911 1727096300.70157: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096300.70173: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096300.70182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096300.70198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096300.70217: stderr chunk (state=3): 
>>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 18911 1727096300.70224: stderr chunk (state=3): >>>debug2: match not found <<< 18911 1727096300.70235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096300.70251: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18911 1727096300.70330: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096300.70351: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096300.70415: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096300.70419: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096300.70499: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096300.72234: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 18911 1727096300.72253: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension 
"expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18911 1727096300.72307: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18911 1727096300.72393: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18911d7od04qi/tmpt90ca9ey /root/.ansible/tmp/ansible-tmp-1727096300.522652-19881-233453067335836/AnsiballZ_ping.py <<< 18911 1727096300.72397: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096300.522652-19881-233453067335836/AnsiballZ_ping.py" <<< 18911 1727096300.72459: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18911d7od04qi/tmpt90ca9ey" to remote "/root/.ansible/tmp/ansible-tmp-1727096300.522652-19881-233453067335836/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096300.522652-19881-233453067335836/AnsiballZ_ping.py" <<< 18911 1727096300.73285: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096300.73474: stderr chunk (state=3): >>><<< 18911 1727096300.73478: stdout chunk (state=3): >>><<< 18911 1727096300.73480: done transferring module to remote 18911 1727096300.73482: _low_level_execute_command(): starting 18911 1727096300.73485: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096300.522652-19881-233453067335836/ /root/.ansible/tmp/ansible-tmp-1727096300.522652-19881-233453067335836/AnsiballZ_ping.py && sleep 0' 18911 1727096300.74095: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096300.74110: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096300.74125: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 18911 1727096300.74156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096300.74269: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096300.74294: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096300.74396: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096300.76360: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096300.76401: stdout chunk (state=3): >>><<< 18911 1727096300.76404: stderr chunk (state=3): >>><<< 18911 1727096300.76421: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096300.76430: _low_level_execute_command(): starting 18911 1727096300.76516: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096300.522652-19881-233453067335836/AnsiballZ_ping.py && sleep 0' 18911 1727096300.77111: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096300.77233: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096300.77385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096300.77490: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096300.77530: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096300.77633: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096300.93279: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 18911 1727096300.94690: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. <<< 18911 1727096300.94714: stdout chunk (state=3): >>><<< 18911 1727096300.94726: stderr chunk (state=3): >>><<< 18911 1727096300.94747: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 18911 1727096300.94779: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096300.522652-19881-233453067335836/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18911 1727096300.94794: _low_level_execute_command(): starting 18911 1727096300.94803: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096300.522652-19881-233453067335836/ > /dev/null 2>&1 && sleep 0' 18911 1727096300.95461: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096300.95515: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096300.95582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096300.95646: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096300.95664: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096300.95694: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096300.95787: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096300.97906: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096300.97926: stderr chunk (state=3): >>><<< 18911 1727096300.97944: stdout chunk (state=3): >>><<< 18911 1727096300.97973: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096300.97982: handler run complete 18911 1727096300.98056: attempt loop complete, returning result 18911 1727096300.98060: _execute() done 18911 1727096300.98062: dumping result to json 18911 1727096300.98072: done dumping result, returning 18911 1727096300.98094: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0afff68d-5257-09a7-aae1-00000000002e] 18911 1727096300.98163: sending task result for task 0afff68d-5257-09a7-aae1-00000000002e ok: [managed_node1] => { "changed": false, "ping": "pong" } 18911 1727096300.98298: no more pending results, returning what we have 18911 1727096300.98301: results queue empty 18911 1727096300.98302: checking for any_errors_fatal 18911 1727096300.98308: done checking for any_errors_fatal 18911 1727096300.98309: checking for max_fail_percentage 18911 1727096300.98310: done checking for max_fail_percentage 18911 1727096300.98311: checking to see if all hosts have failed and the running result is not ok 18911 1727096300.98311: done checking to see if all hosts have failed 18911 1727096300.98312: getting the remaining hosts for this loop 18911 1727096300.98313: done getting the remaining hosts for this loop 18911 1727096300.98316: getting the next task for host managed_node1 18911 1727096300.98323: done getting next task for host managed_node1 18911 1727096300.98325: ^ task is: TASK: meta (role_complete) 18911 1727096300.98327: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096300.98334: getting variables 18911 1727096300.98336: in VariableManager get_vars() 18911 1727096300.98375: Calling all_inventory to load vars for managed_node1 18911 1727096300.98378: Calling groups_inventory to load vars for managed_node1 18911 1727096300.98380: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096300.98387: done sending task result for task 0afff68d-5257-09a7-aae1-00000000002e 18911 1727096300.98391: WORKER PROCESS EXITING 18911 1727096300.98485: Calling all_plugins_play to load vars for managed_node1 18911 1727096300.98489: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096300.98492: Calling groups_plugins_play to load vars for managed_node1 18911 1727096301.00052: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096301.02559: done with get_vars() 18911 1727096301.02699: done getting variables 18911 1727096301.02780: done queuing things up, now waiting for results queue to drain 18911 1727096301.02869: results queue empty 18911 1727096301.02871: checking for any_errors_fatal 18911 1727096301.02874: done checking for any_errors_fatal 18911 1727096301.02875: checking for max_fail_percentage 18911 1727096301.02876: done checking for max_fail_percentage 18911 1727096301.02877: checking to see if all hosts have failed and the running result is not ok 18911 1727096301.02877: done checking to see if all hosts have failed 18911 1727096301.02878: getting the remaining hosts for this loop 18911 1727096301.02879: done getting the remaining hosts for this loop 18911 1727096301.02882: getting the next task for host managed_node1 18911 1727096301.02886: done getting next task for host managed_node1 18911 1727096301.02889: ^ task is: TASK: Include the task 'assert_output_in_stderr_without_warnings.yml' 18911 1727096301.02970: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18911 1727096301.02974: getting variables 18911 1727096301.02975: in VariableManager get_vars() 18911 1727096301.02988: Calling all_inventory to load vars for managed_node1 18911 1727096301.02990: Calling groups_inventory to load vars for managed_node1 18911 1727096301.02992: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096301.02997: Calling all_plugins_play to load vars for managed_node1 18911 1727096301.03005: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096301.03008: Calling groups_plugins_play to load vars for managed_node1 18911 1727096301.05344: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096301.08535: done with get_vars() 18911 1727096301.08562: done getting variables TASK [Include the task 'assert_output_in_stderr_without_warnings.yml'] ********* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:47 Monday 23 September 2024 08:58:21 -0400 (0:00:00.615) 0:00:20.202 ****** 18911 1727096301.08855: entering _queue_task() for managed_node1/include_tasks 18911 1727096301.09464: worker is 1 (out of 1 available) 18911 1727096301.09478: exiting _queue_task() for managed_node1/include_tasks 18911 1727096301.09489: done queuing things up, now waiting for results queue to drain 18911 1727096301.09491: waiting for pending results... 
18911 1727096301.10185: running TaskExecutor() for managed_node1/TASK: Include the task 'assert_output_in_stderr_without_warnings.yml' 18911 1727096301.10191: in run() - task 0afff68d-5257-09a7-aae1-000000000030 18911 1727096301.10194: variable 'ansible_search_path' from source: unknown 18911 1727096301.10197: calling self._execute() 18911 1727096301.10469: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096301.10482: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096301.10502: variable 'omit' from source: magic vars 18911 1727096301.11300: variable 'ansible_distribution_major_version' from source: facts 18911 1727096301.11318: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096301.11329: _execute() done 18911 1727096301.11773: dumping result to json 18911 1727096301.11777: done dumping result, returning 18911 1727096301.11780: done running TaskExecutor() for managed_node1/TASK: Include the task 'assert_output_in_stderr_without_warnings.yml' [0afff68d-5257-09a7-aae1-000000000030] 18911 1727096301.11782: sending task result for task 0afff68d-5257-09a7-aae1-000000000030 18911 1727096301.11856: done sending task result for task 0afff68d-5257-09a7-aae1-000000000030 18911 1727096301.11860: WORKER PROCESS EXITING 18911 1727096301.11893: no more pending results, returning what we have 18911 1727096301.11898: in VariableManager get_vars() 18911 1727096301.11940: Calling all_inventory to load vars for managed_node1 18911 1727096301.11943: Calling groups_inventory to load vars for managed_node1 18911 1727096301.11945: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096301.11955: Calling all_plugins_play to load vars for managed_node1 18911 1727096301.11957: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096301.11959: Calling groups_plugins_play to load vars for managed_node1 18911 1727096301.15010: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096301.18226: done with get_vars() 18911 1727096301.18251: variable 'ansible_search_path' from source: unknown 18911 1727096301.18269: we have included files to process 18911 1727096301.18271: generating all_blocks data 18911 1727096301.18273: done generating all_blocks data 18911 1727096301.18391: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml 18911 1727096301.18393: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml 18911 1727096301.18397: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml 18911 1727096301.18962: done processing included file 18911 1727096301.18964: iterating over new_blocks loaded from include file 18911 1727096301.18965: in VariableManager get_vars() 18911 1727096301.18983: done with get_vars() 18911 1727096301.19069: filtering new block on tags 18911 1727096301.19088: done filtering new block on tags 18911 1727096301.19091: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml for managed_node1 18911 1727096301.19100: extending task lists for all hosts with included blocks 18911 1727096301.19131: done extending task lists 18911 1727096301.19132: done processing included files 18911 1727096301.19133: results queue empty 18911 1727096301.19133: checking for any_errors_fatal 18911 1727096301.19135: done checking for any_errors_fatal 18911 1727096301.19136: checking for max_fail_percentage 18911 1727096301.19137: done checking for 
max_fail_percentage 18911 1727096301.19138: checking to see if all hosts have failed and the running result is not ok 18911 1727096301.19138: done checking to see if all hosts have failed 18911 1727096301.19139: getting the remaining hosts for this loop 18911 1727096301.19140: done getting the remaining hosts for this loop 18911 1727096301.19142: getting the next task for host managed_node1 18911 1727096301.19146: done getting next task for host managed_node1 18911 1727096301.19148: ^ task is: TASK: Assert that warnings is empty 18911 1727096301.19150: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096301.19152: getting variables 18911 1727096301.19153: in VariableManager get_vars() 18911 1727096301.19164: Calling all_inventory to load vars for managed_node1 18911 1727096301.19166: Calling groups_inventory to load vars for managed_node1 18911 1727096301.19318: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096301.19324: Calling all_plugins_play to load vars for managed_node1 18911 1727096301.19327: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096301.19330: Calling groups_plugins_play to load vars for managed_node1 18911 1727096301.21652: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096301.25094: done with get_vars() 18911 1727096301.25118: done getting variables 18911 1727096301.25165: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that warnings is empty] ******************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml:3 Monday 23 September 2024 08:58:21 -0400 (0:00:00.163) 0:00:20.366 ****** 18911 1727096301.25196: entering _queue_task() for managed_node1/assert 18911 1727096301.25864: worker is 1 (out of 1 available) 18911 1727096301.25880: exiting _queue_task() for managed_node1/assert 18911 1727096301.25892: done queuing things up, now waiting for results queue to drain 18911 1727096301.25894: waiting for pending results... 
18911 1727096301.26352: running TaskExecutor() for managed_node1/TASK: Assert that warnings is empty 18911 1727096301.26528: in run() - task 0afff68d-5257-09a7-aae1-000000000304 18911 1727096301.26541: variable 'ansible_search_path' from source: unknown 18911 1727096301.26545: variable 'ansible_search_path' from source: unknown 18911 1727096301.26620: calling self._execute() 18911 1727096301.26715: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096301.26833: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096301.26939: variable 'omit' from source: magic vars 18911 1727096301.27560: variable 'ansible_distribution_major_version' from source: facts 18911 1727096301.27574: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096301.27580: variable 'omit' from source: magic vars 18911 1727096301.27645: variable 'omit' from source: magic vars 18911 1727096301.27925: variable 'omit' from source: magic vars 18911 1727096301.27966: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18911 1727096301.28116: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18911 1727096301.28142: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18911 1727096301.28155: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096301.28373: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096301.28377: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18911 1727096301.28380: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096301.28383: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 
1727096301.28505: Set connection var ansible_shell_executable to /bin/sh 18911 1727096301.28511: Set connection var ansible_timeout to 10 18911 1727096301.28514: Set connection var ansible_shell_type to sh 18911 1727096301.28522: Set connection var ansible_module_compression to ZIP_DEFLATED 18911 1727096301.28527: Set connection var ansible_pipelining to False 18911 1727096301.28651: Set connection var ansible_connection to ssh 18911 1727096301.28677: variable 'ansible_shell_executable' from source: unknown 18911 1727096301.28681: variable 'ansible_connection' from source: unknown 18911 1727096301.28684: variable 'ansible_module_compression' from source: unknown 18911 1727096301.28687: variable 'ansible_shell_type' from source: unknown 18911 1727096301.28689: variable 'ansible_shell_executable' from source: unknown 18911 1727096301.28692: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096301.28694: variable 'ansible_pipelining' from source: unknown 18911 1727096301.28696: variable 'ansible_timeout' from source: unknown 18911 1727096301.28701: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096301.28950: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18911 1727096301.28961: variable 'omit' from source: magic vars 18911 1727096301.29078: starting attempt loop 18911 1727096301.29082: running the handler 18911 1727096301.29317: variable '__network_connections_result' from source: set_fact 18911 1727096301.29331: Evaluated conditional ('warnings' not in __network_connections_result): True 18911 1727096301.29334: handler run complete 18911 1727096301.29350: attempt loop complete, returning result 18911 1727096301.29354: _execute() done 18911 
1727096301.29357: dumping result to json 18911 1727096301.29359: done dumping result, returning 18911 1727096301.29371: done running TaskExecutor() for managed_node1/TASK: Assert that warnings is empty [0afff68d-5257-09a7-aae1-000000000304] 18911 1727096301.29514: sending task result for task 0afff68d-5257-09a7-aae1-000000000304 18911 1727096301.29656: done sending task result for task 0afff68d-5257-09a7-aae1-000000000304 18911 1727096301.29659: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 18911 1727096301.29739: no more pending results, returning what we have 18911 1727096301.29743: results queue empty 18911 1727096301.29744: checking for any_errors_fatal 18911 1727096301.29746: done checking for any_errors_fatal 18911 1727096301.29746: checking for max_fail_percentage 18911 1727096301.29748: done checking for max_fail_percentage 18911 1727096301.29750: checking to see if all hosts have failed and the running result is not ok 18911 1727096301.29750: done checking to see if all hosts have failed 18911 1727096301.29751: getting the remaining hosts for this loop 18911 1727096301.29753: done getting the remaining hosts for this loop 18911 1727096301.29756: getting the next task for host managed_node1 18911 1727096301.29767: done getting next task for host managed_node1 18911 1727096301.29773: ^ task is: TASK: Assert that there is output in stderr 18911 1727096301.29776: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 18911 1727096301.29779: getting variables 18911 1727096301.29781: in VariableManager get_vars() 18911 1727096301.29821: Calling all_inventory to load vars for managed_node1 18911 1727096301.29824: Calling groups_inventory to load vars for managed_node1 18911 1727096301.29827: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096301.29839: Calling all_plugins_play to load vars for managed_node1 18911 1727096301.29843: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096301.29846: Calling groups_plugins_play to load vars for managed_node1 18911 1727096301.38979: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096301.43675: done with get_vars() 18911 1727096301.43700: done getting variables 18911 1727096301.43747: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that there is output in stderr] *********************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml:8 Monday 23 September 2024 08:58:21 -0400 (0:00:00.186) 0:00:20.553 ****** 18911 1727096301.43879: entering _queue_task() for managed_node1/assert 18911 1727096301.45026: worker is 1 (out of 1 available) 18911 1727096301.45039: exiting _queue_task() for managed_node1/assert 18911 1727096301.45052: done queuing things up, now waiting for results queue to drain 18911 1727096301.45053: waiting for pending results... 
18911 1727096301.45787: running TaskExecutor() for managed_node1/TASK: Assert that there is output in stderr 18911 1727096301.45792: in run() - task 0afff68d-5257-09a7-aae1-000000000305 18911 1727096301.45798: variable 'ansible_search_path' from source: unknown 18911 1727096301.45801: variable 'ansible_search_path' from source: unknown 18911 1727096301.46173: calling self._execute() 18911 1727096301.46177: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096301.46180: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096301.46183: variable 'omit' from source: magic vars 18911 1727096301.46973: variable 'ansible_distribution_major_version' from source: facts 18911 1727096301.46976: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096301.46982: variable 'omit' from source: magic vars 18911 1727096301.46985: variable 'omit' from source: magic vars 18911 1727096301.46988: variable 'omit' from source: magic vars 18911 1727096301.47372: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18911 1727096301.47376: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18911 1727096301.47379: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18911 1727096301.47381: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096301.47383: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096301.47386: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18911 1727096301.47389: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096301.47391: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 
1727096301.47610: Set connection var ansible_shell_executable to /bin/sh 18911 1727096301.47622: Set connection var ansible_timeout to 10 18911 1727096301.47630: Set connection var ansible_shell_type to sh 18911 1727096301.47644: Set connection var ansible_module_compression to ZIP_DEFLATED 18911 1727096301.47653: Set connection var ansible_pipelining to False 18911 1727096301.47662: Set connection var ansible_connection to ssh 18911 1727096301.47690: variable 'ansible_shell_executable' from source: unknown 18911 1727096301.47698: variable 'ansible_connection' from source: unknown 18911 1727096301.47705: variable 'ansible_module_compression' from source: unknown 18911 1727096301.47711: variable 'ansible_shell_type' from source: unknown 18911 1727096301.47718: variable 'ansible_shell_executable' from source: unknown 18911 1727096301.47725: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096301.47733: variable 'ansible_pipelining' from source: unknown 18911 1727096301.47739: variable 'ansible_timeout' from source: unknown 18911 1727096301.47747: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096301.48094: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18911 1727096301.48109: variable 'omit' from source: magic vars 18911 1727096301.48119: starting attempt loop 18911 1727096301.48127: running the handler 18911 1727096301.48253: variable '__network_connections_result' from source: set_fact 18911 1727096301.48672: Evaluated conditional ('stderr' in __network_connections_result): True 18911 1727096301.48676: handler run complete 18911 1727096301.48678: attempt loop complete, returning result 18911 1727096301.48681: _execute() done 18911 
1727096301.48683: dumping result to json 18911 1727096301.48685: done dumping result, returning 18911 1727096301.48687: done running TaskExecutor() for managed_node1/TASK: Assert that there is output in stderr [0afff68d-5257-09a7-aae1-000000000305] 18911 1727096301.48689: sending task result for task 0afff68d-5257-09a7-aae1-000000000305 18911 1727096301.48755: done sending task result for task 0afff68d-5257-09a7-aae1-000000000305 18911 1727096301.48760: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 18911 1727096301.48814: no more pending results, returning what we have 18911 1727096301.48817: results queue empty 18911 1727096301.48818: checking for any_errors_fatal 18911 1727096301.48827: done checking for any_errors_fatal 18911 1727096301.48827: checking for max_fail_percentage 18911 1727096301.48829: done checking for max_fail_percentage 18911 1727096301.48830: checking to see if all hosts have failed and the running result is not ok 18911 1727096301.48831: done checking to see if all hosts have failed 18911 1727096301.48831: getting the remaining hosts for this loop 18911 1727096301.48833: done getting the remaining hosts for this loop 18911 1727096301.48836: getting the next task for host managed_node1 18911 1727096301.48846: done getting next task for host managed_node1 18911 1727096301.48849: ^ task is: TASK: meta (flush_handlers) 18911 1727096301.48850: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096301.48856: getting variables 18911 1727096301.48858: in VariableManager get_vars() 18911 1727096301.48902: Calling all_inventory to load vars for managed_node1 18911 1727096301.48906: Calling groups_inventory to load vars for managed_node1 18911 1727096301.48909: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096301.48919: Calling all_plugins_play to load vars for managed_node1 18911 1727096301.48922: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096301.48925: Calling groups_plugins_play to load vars for managed_node1 18911 1727096301.52746: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096301.56008: done with get_vars() 18911 1727096301.56033: done getting variables 18911 1727096301.56311: in VariableManager get_vars() 18911 1727096301.56324: Calling all_inventory to load vars for managed_node1 18911 1727096301.56327: Calling groups_inventory to load vars for managed_node1 18911 1727096301.56329: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096301.56335: Calling all_plugins_play to load vars for managed_node1 18911 1727096301.56337: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096301.56340: Calling groups_plugins_play to load vars for managed_node1 18911 1727096301.60070: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096301.64895: done with get_vars() 18911 1727096301.64929: done queuing things up, now waiting for results queue to drain 18911 1727096301.64931: results queue empty 18911 1727096301.64932: checking for any_errors_fatal 18911 1727096301.64935: done checking for any_errors_fatal 18911 1727096301.64935: checking for max_fail_percentage 18911 1727096301.64937: done checking for max_fail_percentage 18911 1727096301.64937: checking to see if all hosts have failed and the running result is not 
ok 18911 1727096301.64938: done checking to see if all hosts have failed 18911 1727096301.64939: getting the remaining hosts for this loop 18911 1727096301.64946: done getting the remaining hosts for this loop 18911 1727096301.64949: getting the next task for host managed_node1 18911 1727096301.64952: done getting next task for host managed_node1 18911 1727096301.64954: ^ task is: TASK: meta (flush_handlers) 18911 1727096301.64955: ^ state is: HOST STATE: block=6, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18911 1727096301.64958: getting variables 18911 1727096301.64959: in VariableManager get_vars() 18911 1727096301.65179: Calling all_inventory to load vars for managed_node1 18911 1727096301.65181: Calling groups_inventory to load vars for managed_node1 18911 1727096301.65183: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096301.65189: Calling all_plugins_play to load vars for managed_node1 18911 1727096301.65191: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096301.65194: Calling groups_plugins_play to load vars for managed_node1 18911 1727096301.67800: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096301.71050: done with get_vars() 18911 1727096301.71077: done getting variables 18911 1727096301.71132: in VariableManager get_vars() 18911 1727096301.71146: Calling all_inventory to load vars for managed_node1 18911 1727096301.71148: Calling groups_inventory to load vars for managed_node1 18911 1727096301.71151: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096301.71156: Calling all_plugins_play to load vars for managed_node1 18911 1727096301.71158: Calling groups_plugins_inventory to load vars for 
managed_node1 18911 1727096301.71161: Calling groups_plugins_play to load vars for managed_node1 18911 1727096301.72347: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096301.75072: done with get_vars() 18911 1727096301.75113: done queuing things up, now waiting for results queue to drain 18911 1727096301.75115: results queue empty 18911 1727096301.75116: checking for any_errors_fatal 18911 1727096301.75117: done checking for any_errors_fatal 18911 1727096301.75118: checking for max_fail_percentage 18911 1727096301.75119: done checking for max_fail_percentage 18911 1727096301.75120: checking to see if all hosts have failed and the running result is not ok 18911 1727096301.75121: done checking to see if all hosts have failed 18911 1727096301.75121: getting the remaining hosts for this loop 18911 1727096301.75122: done getting the remaining hosts for this loop 18911 1727096301.75125: getting the next task for host managed_node1 18911 1727096301.75128: done getting next task for host managed_node1 18911 1727096301.75129: ^ task is: None 18911 1727096301.75131: ^ state is: HOST STATE: block=7, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096301.75132: done queuing things up, now waiting for results queue to drain 18911 1727096301.75133: results queue empty 18911 1727096301.75133: checking for any_errors_fatal 18911 1727096301.75134: done checking for any_errors_fatal 18911 1727096301.75135: checking for max_fail_percentage 18911 1727096301.75136: done checking for max_fail_percentage 18911 1727096301.75136: checking to see if all hosts have failed and the running result is not ok 18911 1727096301.75137: done checking to see if all hosts have failed 18911 1727096301.75138: getting the next task for host managed_node1 18911 1727096301.75140: done getting next task for host managed_node1 18911 1727096301.75141: ^ task is: None 18911 1727096301.75142: ^ state is: HOST STATE: block=7, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096301.75200: in VariableManager get_vars() 18911 1727096301.75217: done with get_vars() 18911 1727096301.75222: in VariableManager get_vars() 18911 1727096301.75230: done with get_vars() 18911 1727096301.75234: variable 'omit' from source: magic vars 18911 1727096301.75265: in VariableManager get_vars() 18911 1727096301.75278: done with get_vars() 18911 1727096301.75414: variable 'omit' from source: magic vars PLAY [Play for cleaning up the test device and the connection profile] ********* 18911 1727096301.75932: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18911 1727096301.76073: getting the remaining hosts for this loop 18911 1727096301.76074: done getting the remaining hosts for this loop 18911 1727096301.76077: getting the next task for host managed_node1 18911 1727096301.76080: done getting next task for host managed_node1 18911 1727096301.76082: ^ task is: TASK: Gathering Facts 18911 1727096301.76083: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096301.76085: getting variables 18911 1727096301.76086: in VariableManager get_vars() 18911 1727096301.76095: Calling all_inventory to load vars for managed_node1 18911 1727096301.76097: Calling groups_inventory to load vars for managed_node1 18911 1727096301.76100: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096301.76105: Calling all_plugins_play to load vars for managed_node1 18911 1727096301.76108: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096301.76111: Calling groups_plugins_play to load vars for managed_node1 18911 1727096301.78521: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096301.80922: done with get_vars() 18911 1727096301.80946: done getting variables 18911 1727096301.80997: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:50 Monday 23 September 2024 08:58:21 -0400 (0:00:00.371) 0:00:20.924 ****** 18911 1727096301.81022: entering _queue_task() for managed_node1/gather_facts 18911 1727096301.81364: worker is 1 (out of 1 available) 18911 1727096301.81481: exiting _queue_task() for managed_node1/gather_facts 18911 1727096301.81491: done queuing things up, now waiting for results queue to drain 18911 1727096301.81493: waiting for pending results... 
18911 1727096301.81664: running TaskExecutor() for managed_node1/TASK: Gathering Facts 18911 1727096301.81785: in run() - task 0afff68d-5257-09a7-aae1-000000000316 18911 1727096301.81811: variable 'ansible_search_path' from source: unknown 18911 1727096301.81862: calling self._execute() 18911 1727096301.81973: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096301.81987: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096301.82001: variable 'omit' from source: magic vars 18911 1727096301.82415: variable 'ansible_distribution_major_version' from source: facts 18911 1727096301.82434: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096301.82445: variable 'omit' from source: magic vars 18911 1727096301.82485: variable 'omit' from source: magic vars 18911 1727096301.82524: variable 'omit' from source: magic vars 18911 1727096301.82574: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18911 1727096301.82618: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18911 1727096301.82646: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18911 1727096301.82675: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096301.82695: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096301.82729: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18911 1727096301.82738: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096301.82746: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096301.82852: Set connection var ansible_shell_executable to /bin/sh 18911 1727096301.82864: Set 
connection var ansible_timeout to 10 18911 1727096301.82874: Set connection var ansible_shell_type to sh 18911 1727096301.82890: Set connection var ansible_module_compression to ZIP_DEFLATED 18911 1727096301.82899: Set connection var ansible_pipelining to False 18911 1727096301.82912: Set connection var ansible_connection to ssh 18911 1727096301.82972: variable 'ansible_shell_executable' from source: unknown 18911 1727096301.82976: variable 'ansible_connection' from source: unknown 18911 1727096301.82978: variable 'ansible_module_compression' from source: unknown 18911 1727096301.82981: variable 'ansible_shell_type' from source: unknown 18911 1727096301.82983: variable 'ansible_shell_executable' from source: unknown 18911 1727096301.82985: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096301.82988: variable 'ansible_pipelining' from source: unknown 18911 1727096301.82990: variable 'ansible_timeout' from source: unknown 18911 1727096301.82997: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096301.83171: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18911 1727096301.83216: variable 'omit' from source: magic vars 18911 1727096301.83220: starting attempt loop 18911 1727096301.83222: running the handler 18911 1727096301.83227: variable 'ansible_facts' from source: unknown 18911 1727096301.83255: _low_level_execute_command(): starting 18911 1727096301.83270: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18911 1727096301.84095: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096301.84116: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096301.84140: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096301.84156: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096301.84308: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096301.86120: stdout chunk (state=3): >>>/root <<< 18911 1727096301.86181: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096301.86273: stderr chunk (state=3): >>><<< 18911 1727096301.86277: stdout chunk (state=3): >>><<< 18911 1727096301.86280: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096301.86283: _low_level_execute_command(): starting 18911 1727096301.86426: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096301.8625534-19935-46099222701206 `" && echo ansible-tmp-1727096301.8625534-19935-46099222701206="` echo /root/.ansible/tmp/ansible-tmp-1727096301.8625534-19935-46099222701206 `" ) && sleep 0' 18911 1727096301.87360: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096301.87428: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096301.87444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096301.87462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096301.87585: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096301.87640: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096301.87750: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096301.87986: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096301.88051: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096301.90071: stdout chunk (state=3): >>>ansible-tmp-1727096301.8625534-19935-46099222701206=/root/.ansible/tmp/ansible-tmp-1727096301.8625534-19935-46099222701206 <<< 18911 1727096301.90171: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096301.90212: stderr chunk (state=3): >>><<< 18911 1727096301.90220: stdout chunk (state=3): >>><<< 18911 1727096301.90289: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096301.8625534-19935-46099222701206=/root/.ansible/tmp/ansible-tmp-1727096301.8625534-19935-46099222701206 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096301.90325: variable 'ansible_module_compression' from source: unknown 18911 1727096301.90577: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18911d7od04qi/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 18911 1727096301.90623: variable 'ansible_facts' from source: unknown 18911 1727096301.90940: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096301.8625534-19935-46099222701206/AnsiballZ_setup.py 18911 1727096301.91514: Sending initial data 18911 1727096301.91524: Sent initial data (153 bytes) 18911 1727096301.92685: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096301.92822: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096301.92919: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096301.94577: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 18911 1727096301.94611: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18911 1727096301.94745: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18911 1727096301.94810: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18911d7od04qi/tmp_9i5ehqt /root/.ansible/tmp/ansible-tmp-1727096301.8625534-19935-46099222701206/AnsiballZ_setup.py <<< 18911 1727096301.94813: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096301.8625534-19935-46099222701206/AnsiballZ_setup.py" <<< 18911 1727096301.94900: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18911d7od04qi/tmp_9i5ehqt" to remote "/root/.ansible/tmp/ansible-tmp-1727096301.8625534-19935-46099222701206/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096301.8625534-19935-46099222701206/AnsiballZ_setup.py" <<< 18911 1727096301.97758: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096301.97965: stdout chunk (state=3): >>><<< 18911 1727096301.97972: stderr chunk (state=3): >>><<< 18911 1727096301.97974: done transferring module to remote 18911 1727096301.97976: _low_level_execute_command(): starting 18911 1727096301.97979: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096301.8625534-19935-46099222701206/ /root/.ansible/tmp/ansible-tmp-1727096301.8625534-19935-46099222701206/AnsiballZ_setup.py && sleep 0' 18911 1727096301.99197: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096301.99286: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096301.99473: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096301.99504: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096301.99645: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096302.01629: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096302.01633: stdout chunk (state=3): >>><<< 18911 1727096302.01635: stderr chunk (state=3): >>><<< 18911 1727096302.01652: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096302.01952: _low_level_execute_command(): starting 18911 1727096302.01957: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096301.8625534-19935-46099222701206/AnsiballZ_setup.py && sleep 0' 18911 1727096302.02884: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096302.03172: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096302.03194: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096302.03214: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096302.03320: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096302.72149: 
stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCSodd+GDkKMH+cjKateYxMnFqGIiE1cnsha6Cte8GTIWdeerR6yaWx/72lnuD/E1u9q+QiEGNjwKJ+EVlEbMpe1E41fS12cQMWtucXlJvMJH1Ykit9m7FF02YjLW9EkA9SSgc18cxhBxGHI+sOyXoiqbbp/q8CeDDI6WWQ+a3ZlRONcuaxwv9JBCJY6ZIqEDUsG55fAkwTft0LrM9VfN5D7iCcrlRTAVcjACJKgdvzHt52sVh5wuhCgzisIGrxldgqYAo0PnKgLwHzMWTYpwBJQIwMf6hC7O+PTnljDRXhsY8vTDdVI9W4Nr7mLWJ7W5JZ0lrKajT+FZ1b8usigp6firlnTtqzVmKJ4HJTlBOQ00HmCvVRSFb8Y15UB8Gz/xjJSXiQklJoKt+/wRRazuZALEowaIIdTzQUGv5Yx/BYhcAv8KVvhqbfkvyuwAgI+jzck3nhlUJ3B1WVYs5RdDY1WWEbqEtKyERi7P+cyWg6OhEMmJIpLs1QBW+CU/89w+s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE9Vvl7mU+dsvtGHlOyMt0Nx2BOUkSxor6Ul/pQgHJkP0t5ie5s9TW9uniTzz5NfayEXaMlnOkK/UBl0o7msG9k=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOsOWQBYI+BXY81/x/jv3URNQxi5p8vLDeQfEZVgnjS4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_hostname": 
"ip-10-31-11-125", "ansible_nodename": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2082d1236a0674562ec5e13633e7ec", "ansible_local": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_iscsi_iqn": "", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_lsb": {}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 40710 10.31.11.125 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 40710 22", 
"DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "<<< 18911 1727096302.72255: stdout chunk (state=3): >>>SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "58", "second": "22", "epoch": "1727096302", "epoch_int": "1727096302", "date": "2024-09-23", "time": "08:58:22", "iso8601_micro": "2024-09-23T12:58:22.317232Z", "iso8601": "2024-09-23T12:58:22Z", "iso8601_basic": "20240923T085822317232", "iso8601_basic_short": "20240923T085822", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_is_chroot": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2963, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 568, "free": 2963}, "nocache": {"free": 3300, "used": 231}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", 
"ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_uuid": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 455, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261795368960, "block_size": 4096, "block_total": 65519099, "block_available": 63914885, "block_used": 1604214, "inode_total": 131070960, "inode_available": 131029099, "inode_used": 41861, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_apparmor": {"status": "disabled"}, "ansible_loadavg": {"1m": 0.57275390625, "5m": 0.37744140625, "15m": 0.1865234375}, 
"ansible_fips": false, "ansible_fibre_channel_wwn": [], "ansible_pkg_mgr": "dnf", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_service_mgr": "systemd", "ansible_interfaces": ["lo", "lsr27", "eth0", "peerlsr27"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off 
[fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10ff:acff:fe3f:90f5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", 
"highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "ea:ad:4d:e4:a2:0e", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::e8ad:4dff:fee4:a20e", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", 
"tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lsr27": {"device": "lsr27", "macaddress": 
"da:be:ac:0e:e2:e3", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv4": {"address": "192.0.2.1", "broadcast": "192.0.2.255", "netmask": "255.255.255.0", "network": "192.0.2.0", "prefix": "24"}, "ipv6": [{"address": "fe80::d8be:acff:fe0e:e2e3", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off 
[fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.125", "192.0.2.1"], "ansible_all_ipv6_addresses": ["fe80::10ff:acff:fe3f:90f5", "fe80::e8ad:4dff:fee4:a20e", "fe80::d8be:acff:fe0e:e2e3"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.125", "127.0.0.0/8", "127.0.0.1", "192.0.2.1"], "ipv6": ["::1", "fe80::10ff:acff:fe3f:90f5", "fe80::d8be:acff:fe0e:e2e3", "fe80::e8ad:4dff:fee4:a20e"]}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 18911 1727096302.74476: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
<<< 18911 1727096302.74494: stdout chunk (state=3): >>><<< 18911 1727096302.74508: stderr chunk (state=3): >>><<< 18911 1727096302.75078: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCSodd+GDkKMH+cjKateYxMnFqGIiE1cnsha6Cte8GTIWdeerR6yaWx/72lnuD/E1u9q+QiEGNjwKJ+EVlEbMpe1E41fS12cQMWtucXlJvMJH1Ykit9m7FF02YjLW9EkA9SSgc18cxhBxGHI+sOyXoiqbbp/q8CeDDI6WWQ+a3ZlRONcuaxwv9JBCJY6ZIqEDUsG55fAkwTft0LrM9VfN5D7iCcrlRTAVcjACJKgdvzHt52sVh5wuhCgzisIGrxldgqYAo0PnKgLwHzMWTYpwBJQIwMf6hC7O+PTnljDRXhsY8vTDdVI9W4Nr7mLWJ7W5JZ0lrKajT+FZ1b8usigp6firlnTtqzVmKJ4HJTlBOQ00HmCvVRSFb8Y15UB8Gz/xjJSXiQklJoKt+/wRRazuZALEowaIIdTzQUGv5Yx/BYhcAv8KVvhqbfkvyuwAgI+jzck3nhlUJ3B1WVYs5RdDY1WWEbqEtKyERi7P+cyWg6OhEMmJIpLs1QBW+CU/89w+s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE9Vvl7mU+dsvtGHlOyMt0Nx2BOUkSxor6Ul/pQgHJkP0t5ie5s9TW9uniTzz5NfayEXaMlnOkK/UBl0o7msG9k=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOsOWQBYI+BXY81/x/jv3URNQxi5p8vLDeQfEZVgnjS4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 
UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-125", "ansible_nodename": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2082d1236a0674562ec5e13633e7ec", "ansible_local": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_iscsi_iqn": "", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_lsb": {}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 40710 10.31.11.125 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", 
"SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 40710 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "58", "second": "22", "epoch": "1727096302", "epoch_int": "1727096302", "date": "2024-09-23", "time": "08:58:22", "iso8601_micro": "2024-09-23T12:58:22.317232Z", "iso8601": "2024-09-23T12:58:22Z", "iso8601_basic": "20240923T085822317232", "iso8601_basic_short": "20240923T085822", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_is_chroot": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2963, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 568, "free": 2963}, "nocache": {"free": 3300, "used": 231}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", 
"ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_uuid": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 455, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261795368960, "block_size": 4096, "block_total": 65519099, "block_available": 63914885, "block_used": 1604214, "inode_total": 131070960, "inode_available": 131029099, "inode_used": 41861, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_apparmor": {"status": 
"disabled"}, "ansible_loadavg": {"1m": 0.57275390625, "5m": 0.37744140625, "15m": 0.1865234375}, "ansible_fips": false, "ansible_fibre_channel_wwn": [], "ansible_pkg_mgr": "dnf", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_service_mgr": "systemd", "ansible_interfaces": ["lo", "lsr27", "eth0", "peerlsr27"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", 
"tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10ff:acff:fe3f:90f5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off 
[fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "ea:ad:4d:e4:a2:0e", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::e8ad:4dff:fee4:a20e", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off 
[fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": 
"off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lsr27": {"device": "lsr27", "macaddress": "da:be:ac:0e:e2:e3", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv4": {"address": "192.0.2.1", "broadcast": "192.0.2.255", "netmask": "255.255.255.0", "network": "192.0.2.0", "prefix": "24"}, "ipv6": [{"address": "fe80::d8be:acff:fe0e:e2e3", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", 
"l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.125", "192.0.2.1"], "ansible_all_ipv6_addresses": ["fe80::10ff:acff:fe3f:90f5", "fe80::e8ad:4dff:fee4:a20e", "fe80::d8be:acff:fe0e:e2e3"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.125", "127.0.0.0/8", "127.0.0.1", "192.0.2.1"], "ipv6": ["::1", "fe80::10ff:acff:fe3f:90f5", "fe80::d8be:acff:fe0e:e2e3", "fe80::e8ad:4dff:fee4:a20e"]}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 18911 1727096302.75726: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096301.8625534-19935-46099222701206/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18911 1727096302.75766: _low_level_execute_command(): starting 18911 1727096302.75780: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096301.8625534-19935-46099222701206/ > /dev/null 2>&1 && sleep 0' 18911 1727096302.76650: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096302.76720: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096302.76748: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096302.76871: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096302.76874: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096302.76927: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096302.77272: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096302.79004: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096302.79014: stdout chunk (state=3): >>><<< 18911 1727096302.79027: stderr chunk (state=3): >>><<< 18911 1727096302.79051: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096302.79084: handler run complete 18911 1727096302.79453: variable 'ansible_facts' from source: unknown 18911 1727096302.79666: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096302.80356: variable 'ansible_facts' from source: unknown 18911 1727096302.80487: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096302.80636: attempt loop complete, returning result 18911 1727096302.80649: _execute() done 18911 1727096302.80657: dumping result to json 18911 1727096302.80700: done dumping result, returning 18911 1727096302.80720: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0afff68d-5257-09a7-aae1-000000000316] 18911 1727096302.80730: sending task result for task 0afff68d-5257-09a7-aae1-000000000316 18911 1727096302.81576: done sending task result for task 0afff68d-5257-09a7-aae1-000000000316 18911 1727096302.81579: WORKER PROCESS EXITING ok: [managed_node1] 18911 1727096302.81914: no more pending results, returning what we have 18911 1727096302.81917: results queue empty 18911 1727096302.81918: checking for any_errors_fatal 18911 1727096302.81920: done checking for any_errors_fatal 18911 1727096302.81920: checking for 
max_fail_percentage 18911 1727096302.81922: done checking for max_fail_percentage 18911 1727096302.81923: checking to see if all hosts have failed and the running result is not ok 18911 1727096302.81923: done checking to see if all hosts have failed 18911 1727096302.81924: getting the remaining hosts for this loop 18911 1727096302.81925: done getting the remaining hosts for this loop 18911 1727096302.81929: getting the next task for host managed_node1 18911 1727096302.81934: done getting next task for host managed_node1 18911 1727096302.81936: ^ task is: TASK: meta (flush_handlers) 18911 1727096302.81938: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18911 1727096302.81941: getting variables 18911 1727096302.81942: in VariableManager get_vars() 18911 1727096302.81965: Calling all_inventory to load vars for managed_node1 18911 1727096302.82045: Calling groups_inventory to load vars for managed_node1 18911 1727096302.82049: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096302.82059: Calling all_plugins_play to load vars for managed_node1 18911 1727096302.82062: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096302.82065: Calling groups_plugins_play to load vars for managed_node1 18911 1727096302.83716: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096302.86183: done with get_vars() 18911 1727096302.86207: done getting variables 18911 1727096302.86278: in VariableManager get_vars() 18911 1727096302.86289: Calling all_inventory to load vars for managed_node1 18911 1727096302.86291: Calling groups_inventory to load vars for managed_node1 18911 1727096302.86294: Calling all_plugins_inventory to load 
vars for managed_node1 18911 1727096302.86299: Calling all_plugins_play to load vars for managed_node1 18911 1727096302.86301: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096302.86304: Calling groups_plugins_play to load vars for managed_node1 18911 1727096302.87478: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096302.89059: done with get_vars() 18911 1727096302.89102: done queuing things up, now waiting for results queue to drain 18911 1727096302.89111: results queue empty 18911 1727096302.89112: checking for any_errors_fatal 18911 1727096302.89117: done checking for any_errors_fatal 18911 1727096302.89118: checking for max_fail_percentage 18911 1727096302.89119: done checking for max_fail_percentage 18911 1727096302.89125: checking to see if all hosts have failed and the running result is not ok 18911 1727096302.89126: done checking to see if all hosts have failed 18911 1727096302.89127: getting the remaining hosts for this loop 18911 1727096302.89128: done getting the remaining hosts for this loop 18911 1727096302.89131: getting the next task for host managed_node1 18911 1727096302.89135: done getting next task for host managed_node1 18911 1727096302.89137: ^ task is: TASK: Show network_provider 18911 1727096302.89139: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096302.89141: getting variables 18911 1727096302.89142: in VariableManager get_vars() 18911 1727096302.89160: Calling all_inventory to load vars for managed_node1 18911 1727096302.89162: Calling groups_inventory to load vars for managed_node1 18911 1727096302.89165: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096302.89174: Calling all_plugins_play to load vars for managed_node1 18911 1727096302.89176: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096302.89179: Calling groups_plugins_play to load vars for managed_node1 18911 1727096302.90561: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096302.92083: done with get_vars() 18911 1727096302.92110: done getting variables 18911 1727096302.92156: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show network_provider] *************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:53 Monday 23 September 2024 08:58:22 -0400 (0:00:01.111) 0:00:22.036 ****** 18911 1727096302.92186: entering _queue_task() for managed_node1/debug 18911 1727096302.92531: worker is 1 (out of 1 available) 18911 1727096302.92542: exiting _queue_task() for managed_node1/debug 18911 1727096302.92554: done queuing things up, now waiting for results queue to drain 18911 1727096302.92555: waiting for pending results... 
18911 1727096302.92822: running TaskExecutor() for managed_node1/TASK: Show network_provider 18911 1727096302.92923: in run() - task 0afff68d-5257-09a7-aae1-000000000033 18911 1727096302.92943: variable 'ansible_search_path' from source: unknown 18911 1727096302.92992: calling self._execute() 18911 1727096302.93099: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096302.93103: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096302.93172: variable 'omit' from source: magic vars 18911 1727096302.93501: variable 'ansible_distribution_major_version' from source: facts 18911 1727096302.93518: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096302.93534: variable 'omit' from source: magic vars 18911 1727096302.93570: variable 'omit' from source: magic vars 18911 1727096302.93612: variable 'omit' from source: magic vars 18911 1727096302.93660: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18911 1727096302.93704: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18911 1727096302.93731: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18911 1727096302.93757: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096302.93774: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096302.93808: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18911 1727096302.93972: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096302.93975: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096302.93978: Set connection var ansible_shell_executable to /bin/sh 18911 
1727096302.93981: Set connection var ansible_timeout to 10 18911 1727096302.93983: Set connection var ansible_shell_type to sh 18911 1727096302.93985: Set connection var ansible_module_compression to ZIP_DEFLATED 18911 1727096302.93987: Set connection var ansible_pipelining to False 18911 1727096302.93990: Set connection var ansible_connection to ssh 18911 1727096302.93992: variable 'ansible_shell_executable' from source: unknown 18911 1727096302.93998: variable 'ansible_connection' from source: unknown 18911 1727096302.94006: variable 'ansible_module_compression' from source: unknown 18911 1727096302.94013: variable 'ansible_shell_type' from source: unknown 18911 1727096302.94021: variable 'ansible_shell_executable' from source: unknown 18911 1727096302.94028: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096302.94036: variable 'ansible_pipelining' from source: unknown 18911 1727096302.94043: variable 'ansible_timeout' from source: unknown 18911 1727096302.94051: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096302.94192: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18911 1727096302.94211: variable 'omit' from source: magic vars 18911 1727096302.94226: starting attempt loop 18911 1727096302.94235: running the handler 18911 1727096302.94289: variable 'network_provider' from source: set_fact 18911 1727096302.94380: variable 'network_provider' from source: set_fact 18911 1727096302.94397: handler run complete 18911 1727096302.94420: attempt loop complete, returning result 18911 1727096302.94427: _execute() done 18911 1727096302.94438: dumping result to json 18911 1727096302.94445: done dumping result, returning 18911 1727096302.94547: done 
running TaskExecutor() for managed_node1/TASK: Show network_provider [0afff68d-5257-09a7-aae1-000000000033] 18911 1727096302.94550: sending task result for task 0afff68d-5257-09a7-aae1-000000000033 18911 1727096302.94622: done sending task result for task 0afff68d-5257-09a7-aae1-000000000033 18911 1727096302.94625: WORKER PROCESS EXITING ok: [managed_node1] => { "network_provider": "nm" } 18911 1727096302.94699: no more pending results, returning what we have 18911 1727096302.94703: results queue empty 18911 1727096302.94704: checking for any_errors_fatal 18911 1727096302.94706: done checking for any_errors_fatal 18911 1727096302.94706: checking for max_fail_percentage 18911 1727096302.94708: done checking for max_fail_percentage 18911 1727096302.94710: checking to see if all hosts have failed and the running result is not ok 18911 1727096302.94710: done checking to see if all hosts have failed 18911 1727096302.94711: getting the remaining hosts for this loop 18911 1727096302.94712: done getting the remaining hosts for this loop 18911 1727096302.94716: getting the next task for host managed_node1 18911 1727096302.94725: done getting next task for host managed_node1 18911 1727096302.94727: ^ task is: TASK: meta (flush_handlers) 18911 1727096302.94729: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096302.94735: getting variables 18911 1727096302.94737: in VariableManager get_vars() 18911 1727096302.94768: Calling all_inventory to load vars for managed_node1 18911 1727096302.94772: Calling groups_inventory to load vars for managed_node1 18911 1727096302.94776: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096302.94788: Calling all_plugins_play to load vars for managed_node1 18911 1727096302.94791: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096302.94795: Calling groups_plugins_play to load vars for managed_node1 18911 1727096302.96372: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096302.97964: done with get_vars() 18911 1727096302.97995: done getting variables 18911 1727096302.98060: in VariableManager get_vars() 18911 1727096302.98071: Calling all_inventory to load vars for managed_node1 18911 1727096302.98074: Calling groups_inventory to load vars for managed_node1 18911 1727096302.98076: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096302.98080: Calling all_plugins_play to load vars for managed_node1 18911 1727096302.98082: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096302.98084: Calling groups_plugins_play to load vars for managed_node1 18911 1727096302.99406: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096303.01071: done with get_vars() 18911 1727096303.01106: done queuing things up, now waiting for results queue to drain 18911 1727096303.01108: results queue empty 18911 1727096303.01109: checking for any_errors_fatal 18911 1727096303.01112: done checking for any_errors_fatal 18911 1727096303.01113: checking for max_fail_percentage 18911 1727096303.01114: done checking for max_fail_percentage 18911 1727096303.01114: checking to see if all hosts have failed and the running result is not 
ok 18911 1727096303.01115: done checking to see if all hosts have failed 18911 1727096303.01116: getting the remaining hosts for this loop 18911 1727096303.01117: done getting the remaining hosts for this loop 18911 1727096303.01119: getting the next task for host managed_node1 18911 1727096303.01129: done getting next task for host managed_node1 18911 1727096303.01131: ^ task is: TASK: meta (flush_handlers) 18911 1727096303.01132: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18911 1727096303.01135: getting variables 18911 1727096303.01136: in VariableManager get_vars() 18911 1727096303.01145: Calling all_inventory to load vars for managed_node1 18911 1727096303.01147: Calling groups_inventory to load vars for managed_node1 18911 1727096303.01150: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096303.01155: Calling all_plugins_play to load vars for managed_node1 18911 1727096303.01158: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096303.01160: Calling groups_plugins_play to load vars for managed_node1 18911 1727096303.02317: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096303.03935: done with get_vars() 18911 1727096303.03959: done getting variables 18911 1727096303.04017: in VariableManager get_vars() 18911 1727096303.04027: Calling all_inventory to load vars for managed_node1 18911 1727096303.04029: Calling groups_inventory to load vars for managed_node1 18911 1727096303.04032: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096303.04037: Calling all_plugins_play to load vars for managed_node1 18911 1727096303.04039: Calling groups_plugins_inventory to load vars for 
managed_node1 18911 1727096303.04042: Calling groups_plugins_play to load vars for managed_node1 18911 1727096303.05239: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096303.06896: done with get_vars() 18911 1727096303.06933: done queuing things up, now waiting for results queue to drain 18911 1727096303.06935: results queue empty 18911 1727096303.06936: checking for any_errors_fatal 18911 1727096303.06937: done checking for any_errors_fatal 18911 1727096303.06938: checking for max_fail_percentage 18911 1727096303.06939: done checking for max_fail_percentage 18911 1727096303.06940: checking to see if all hosts have failed and the running result is not ok 18911 1727096303.06941: done checking to see if all hosts have failed 18911 1727096303.06941: getting the remaining hosts for this loop 18911 1727096303.06942: done getting the remaining hosts for this loop 18911 1727096303.06945: getting the next task for host managed_node1 18911 1727096303.06948: done getting next task for host managed_node1 18911 1727096303.06949: ^ task is: None 18911 1727096303.06951: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096303.06952: done queuing things up, now waiting for results queue to drain 18911 1727096303.06953: results queue empty 18911 1727096303.06953: checking for any_errors_fatal 18911 1727096303.06954: done checking for any_errors_fatal 18911 1727096303.06955: checking for max_fail_percentage 18911 1727096303.06956: done checking for max_fail_percentage 18911 1727096303.06956: checking to see if all hosts have failed and the running result is not ok 18911 1727096303.06957: done checking to see if all hosts have failed 18911 1727096303.06958: getting the next task for host managed_node1 18911 1727096303.06960: done getting next task for host managed_node1 18911 1727096303.06961: ^ task is: None 18911 1727096303.06962: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096303.07003: in VariableManager get_vars() 18911 1727096303.07026: done with get_vars() 18911 1727096303.07032: in VariableManager get_vars() 18911 1727096303.07045: done with get_vars() 18911 1727096303.07050: variable 'omit' from source: magic vars 18911 1727096303.07174: variable 'profile' from source: play vars 18911 1727096303.07290: in VariableManager get_vars() 18911 1727096303.07305: done with get_vars() 18911 1727096303.07326: variable 'omit' from source: magic vars 18911 1727096303.07392: variable 'profile' from source: play vars PLAY [Set down {{ profile }}] ************************************************** 18911 1727096303.08081: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18911 1727096303.08221: getting the remaining hosts for this loop 18911 1727096303.08223: done getting the remaining hosts for this loop 18911 1727096303.08226: getting the next task for host managed_node1 18911 1727096303.08228: done getting next task for host managed_node1 18911 1727096303.08230: ^ task is: TASK: Gathering Facts 18911 1727096303.08232: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096303.08234: getting variables 18911 1727096303.08235: in VariableManager get_vars() 18911 1727096303.08247: Calling all_inventory to load vars for managed_node1 18911 1727096303.08250: Calling groups_inventory to load vars for managed_node1 18911 1727096303.08252: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096303.08257: Calling all_plugins_play to load vars for managed_node1 18911 1727096303.08260: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096303.08262: Calling groups_plugins_play to load vars for managed_node1 18911 1727096303.09576: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096303.11644: done with get_vars() 18911 1727096303.11682: done getting variables 18911 1727096303.11739: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3 Monday 23 September 2024 08:58:23 -0400 (0:00:00.195) 0:00:22.232 ****** 18911 1727096303.11768: entering _queue_task() for managed_node1/gather_facts 18911 1727096303.12153: worker is 1 (out of 1 available) 18911 1727096303.12178: exiting _queue_task() for managed_node1/gather_facts 18911 1727096303.12190: done queuing things up, now waiting for results queue to drain 18911 1727096303.12191: waiting for pending results... 
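Every record in this log carries the same two-field prefix: the controller PID and a float Unix timestamp (`18911 1727096303.12153: …`). When reading timings out of output like this, a small parser is often handier than eyeballing. A minimal sketch, assuming only that prefix format (the `parse_events` name and the sample string are illustrative, not part of Ansible's API):

```python
import re

# One event = "<pid> <unix_ts>: <message>", with the next prefix (or end
# of input) terminating the message. Lazy match + lookahead handles
# multiple events run together on one physical line, as in this log.
LINE_RE = re.compile(r"(\d+) (\d+\.\d+): (.*?)(?= \d+ \d+\.\d+: |$)")

def parse_events(text):
    """Yield (pid, timestamp, message) tuples from -vvvv style output."""
    for m in LINE_RE.finditer(text):
        yield int(m.group(1)), float(m.group(2)), m.group(3)

# Two consecutive records copied from the log above.
sample = ("18911 1727096303.12153: worker is 1 (out of 1 available) "
          "18911 1727096303.12178: exiting _queue_task()")

events = list(parse_events(sample))
for pid, ts, msg in events:
    print(pid, ts, msg)
```

The timestamp deltas between events are what the per-task timing lines (e.g. `(0:00:00.195)`) summarize; subtracting adjacent timestamps recovers per-step latency at finer granularity.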
18911 1727096303.12591: running TaskExecutor() for managed_node1/TASK: Gathering Facts 18911 1727096303.12874: in run() - task 0afff68d-5257-09a7-aae1-00000000032b 18911 1727096303.12877: variable 'ansible_search_path' from source: unknown 18911 1727096303.12882: calling self._execute() 18911 1727096303.12885: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096303.12887: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096303.12889: variable 'omit' from source: magic vars 18911 1727096303.13416: variable 'ansible_distribution_major_version' from source: facts 18911 1727096303.13434: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096303.13449: variable 'omit' from source: magic vars 18911 1727096303.13489: variable 'omit' from source: magic vars 18911 1727096303.13529: variable 'omit' from source: magic vars 18911 1727096303.13659: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18911 1727096303.13665: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18911 1727096303.13670: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18911 1727096303.13674: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096303.13690: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096303.13725: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18911 1727096303.13734: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096303.13741: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096303.13851: Set connection var ansible_shell_executable to /bin/sh 18911 1727096303.13865: Set 
connection var ansible_timeout to 10 18911 1727096303.13877: Set connection var ansible_shell_type to sh 18911 1727096303.13889: Set connection var ansible_module_compression to ZIP_DEFLATED 18911 1727096303.13898: Set connection var ansible_pipelining to False 18911 1727096303.13906: Set connection var ansible_connection to ssh 18911 1727096303.13932: variable 'ansible_shell_executable' from source: unknown 18911 1727096303.13939: variable 'ansible_connection' from source: unknown 18911 1727096303.13945: variable 'ansible_module_compression' from source: unknown 18911 1727096303.13982: variable 'ansible_shell_type' from source: unknown 18911 1727096303.13985: variable 'ansible_shell_executable' from source: unknown 18911 1727096303.13987: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096303.13990: variable 'ansible_pipelining' from source: unknown 18911 1727096303.13992: variable 'ansible_timeout' from source: unknown 18911 1727096303.13994: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096303.14169: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18911 1727096303.14198: variable 'omit' from source: magic vars 18911 1727096303.14201: starting attempt loop 18911 1727096303.14204: running the handler 18911 1727096303.14273: variable 'ansible_facts' from source: unknown 18911 1727096303.14276: _low_level_execute_command(): starting 18911 1727096303.14278: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18911 1727096303.14986: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096303.15000: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 
1727096303.15015: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096303.15033: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096303.15051: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 18911 1727096303.15082: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 18911 1727096303.15180: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096303.15192: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096303.15229: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096303.15302: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096303.17086: stdout chunk (state=3): >>>/root <<< 18911 1727096303.17548: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096303.17553: stdout chunk (state=3): >>><<< 18911 1727096303.17555: stderr chunk (state=3): >>><<< 18911 1727096303.17558: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096303.17560: _low_level_execute_command(): starting 18911 1727096303.17565: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096303.174565-19986-191752536468951 `" && echo ansible-tmp-1727096303.174565-19986-191752536468951="` echo /root/.ansible/tmp/ansible-tmp-1727096303.174565-19986-191752536468951 `" ) && sleep 0' 18911 1727096303.18600: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096303.18604: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096303.18625: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096303.18793: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096303.18805: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096303.18901: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096303.20917: stdout chunk (state=3): >>>ansible-tmp-1727096303.174565-19986-191752536468951=/root/.ansible/tmp/ansible-tmp-1727096303.174565-19986-191752536468951 <<< 18911 1727096303.21092: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096303.21374: stderr chunk (state=3): >>><<< 18911 1727096303.21377: stdout chunk (state=3): >>><<< 18911 1727096303.21380: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096303.174565-19986-191752536468951=/root/.ansible/tmp/ansible-tmp-1727096303.174565-19986-191752536468951 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096303.21384: variable 'ansible_module_compression' from source: unknown 18911 1727096303.21387: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18911d7od04qi/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 18911 1727096303.22173: variable 'ansible_facts' from source: unknown 18911 1727096303.22325: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096303.174565-19986-191752536468951/AnsiballZ_setup.py 18911 1727096303.22704: Sending initial data 18911 1727096303.22713: Sent initial data (153 bytes) 18911 1727096303.24085: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096303.24191: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096303.24291: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096303.25936: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18911 1727096303.25998: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18911 1727096303.26154: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18911d7od04qi/tmphyv38_r0 /root/.ansible/tmp/ansible-tmp-1727096303.174565-19986-191752536468951/AnsiballZ_setup.py <<< 18911 1727096303.26386: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096303.174565-19986-191752536468951/AnsiballZ_setup.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18911d7od04qi/tmphyv38_r0" to remote "/root/.ansible/tmp/ansible-tmp-1727096303.174565-19986-191752536468951/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096303.174565-19986-191752536468951/AnsiballZ_setup.py" <<< 18911 1727096303.28976: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096303.29057: stderr chunk (state=3): >>><<< 18911 1727096303.29072: stdout chunk (state=3): >>><<< 18911 1727096303.29102: done transferring module to remote 18911 1727096303.29117: _low_level_execute_command(): starting 18911 1727096303.29127: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096303.174565-19986-191752536468951/ /root/.ansible/tmp/ansible-tmp-1727096303.174565-19986-191752536468951/AnsiballZ_setup.py && sleep 0' 18911 1727096303.30117: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096303.30133: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096303.30147: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096303.30172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096303.30192: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 18911 1727096303.30210: stderr chunk (state=3): >>>debug2: match not 
found <<< 18911 1727096303.30283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096303.30336: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096303.30376: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096303.30390: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096303.30568: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096303.32490: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096303.32526: stderr chunk (state=3): >>><<< 18911 1727096303.32656: stdout chunk (state=3): >>><<< 18911 1727096303.32679: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096303.32682: _low_level_execute_command(): starting 18911 1727096303.32688: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096303.174565-19986-191752536468951/AnsiballZ_setup.py && sleep 0' 18911 1727096303.33786: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096303.33791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096303.33794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found <<< 18911 1727096303.33796: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096303.33848: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096303.33883: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096303.33955: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096304.00132: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-125", "ansible_nodename": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2082d1236a0674562ec5e13633e7ec", "ansible_iscsi_iqn": "", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 40710 10.31.11.125 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 40710 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": 
"/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_apparmor": {"status": "disabled"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCSodd+GDkKMH+cjKateYxMnFqGIiE1cnsha6Cte8GTIWdeerR6yaWx/72lnuD/E1u9q+QiEGNjwKJ+EVlEbMpe1E41fS12cQMWtucXlJvMJH1Ykit9m7FF02YjLW9EkA9SSgc18cxhBxGHI+sOyXoiqbbp/q8CeDDI6WWQ+a3ZlRONcuaxwv9JBCJY6ZIqEDUsG55fAkwTft0LrM9VfN5D7iCcrlRTAVcjACJKgdvzHt52sVh5wuhCgzisIGrxldgqYAo0PnKgLwHzMWTYpwBJQIwMf6hC7O+PTnljDRXhsY8vTDdVI9W4Nr7mLWJ7W5JZ0lrKajT+FZ1b8usigp6firlnTtqzVmKJ4HJTlBOQ00HmCvVRSFb8Y15UB8Gz/xjJSXiQklJoKt+/wRRazuZALEowaIIdTzQUGv5Yx/BYhcAv8KVvhqbfkvyuwAgI+jzck3nhlUJ3B1WVYs5RdDY1WWEbqEtKyERi7P+cyWg6OhEMmJIpLs1QBW+CU/89w+s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE9Vvl7mU+dsvtGHlOyMt0Nx2BOUkSxor6Ul/pQgHJkP0t5ie5s9TW9uniTzz5NfayEXaMlnOkK/UBl0o7msG9k=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOsOWQBYI+BXY81/x/jv3URNQxi5p8vLDeQfEZVgnjS4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fibre_channel_wwn": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, 
"ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2957, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 574, "free": 2957}, "nocache": {"free": 3294, "used": 237}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ans<<< 18911 1727096304.00177: stdout chunk (state=3): >>>ible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_uuid": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, 
"ansible_uptime_seconds": 456, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261795368960, "block_size": 4096, "block_total": 65519099, "block_available": 63914885, "block_used": 1604214, "inode_total": 131070960, "inode_available": 131029099, "inode_used": 41861, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "58", "second": "23", "epoch": "1727096303", "epoch_int": "1727096303", "date": "2024-09-23", "time": "08:58:23", "iso8601_micro": "2024-09-23T12:58:23.932874Z", "iso8601": "2024-09-23T12:58:23Z", "iso8601_basic": "20240923T085823932874", "iso8601_basic_short": "20240923T085823", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_interfaces": ["lo", "peerlsr27", "eth0", "lsr27"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": 
"255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", 
"tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10ff:acff:fe3f:90f5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off 
[fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "ea:ad:4d:e4:a2:0e", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::e8ad:4dff:fee4:a20e", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", 
"tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lsr27": {"device": "lsr27", "macaddress": "da:be:ac:0e:e2:e3", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv4": {"address": "192.0.2.1", "broadcast": "192.0.2.255", "netmask": "255.255.255.0", "network": "192.0.2.0", "prefix": "24"}, "ipv6": [{"address": "fe80::d8be:acff:fe0e:e2e3", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", 
"tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off 
[fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.125", "192.0.2.1"], "ansible_all_ipv6_addresses": ["fe80::10ff:acff:fe3f:90f5", "fe80::e8ad:4dff:fee4:a20e", "fe80::d8be:acff:fe0e:e2e3"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.125", "127.0.0.0/8", "127.0.0.1", "192.0.2.1"], "ipv6": ["::1", "fe80::10ff:acff:fe3f:90f5", "fe80::d8be:acff:fe0e:e2e3", "fe80::e8ad:4dff:fee4:a20e"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fips": false, "ansible_lsb": {}, "ansible_loadavg": {"1m": 0.57275390625, "5m": 0.37744140625, "15m": 0.1865234375}, "ansible_local": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 18911 1727096304.02267: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
<<< 18911 1727096304.02305: stderr chunk (state=3): >>><<< 18911 1727096304.02309: stdout chunk (state=3): >>><<< 18911 1727096304.02355: _low_level_execute_command() done: rc=0, stdout= [facts JSON identical to the stdout chunk above] , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname
10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 18911 1727096304.02663: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096303.174565-19986-191752536468951/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18911 1727096304.02685: _low_level_execute_command(): starting 18911 1727096304.02689: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096303.174565-19986-191752536468951/ > /dev/null 2>&1 && sleep 0' 18911 1727096304.03122: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096304.03128: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096304.03130: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096304.03132: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found <<< 18911 1727096304.03134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096304.03184: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096304.03188: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096304.03261: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096304.05383: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096304.05387: stdout chunk (state=3): >>><<< 18911 1727096304.05389: stderr chunk (state=3): >>><<< 18911 1727096304.05422: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096304.05574: handler run complete 18911 1727096304.05679: variable 'ansible_facts' from source: unknown 18911 1727096304.05840: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096304.06177: variable 'ansible_facts' from source: unknown 18911 1727096304.06252: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096304.06351: attempt loop complete, returning result 18911 1727096304.06358: _execute() done 18911 1727096304.06368: dumping result to json 18911 1727096304.06393: done dumping result, returning 18911 1727096304.06410: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0afff68d-5257-09a7-aae1-00000000032b] 18911 1727096304.06419: sending task result for task 0afff68d-5257-09a7-aae1-00000000032b ok: [managed_node1] 18911 1727096304.07777: done sending task result for task 0afff68d-5257-09a7-aae1-00000000032b 18911 1727096304.07781: WORKER PROCESS EXITING 18911 1727096304.07802: no more pending results, returning what we have 18911 1727096304.07805: results queue empty 18911 1727096304.07806: checking for any_errors_fatal 18911 1727096304.07807: done 
checking for any_errors_fatal 18911 1727096304.07808: checking for max_fail_percentage 18911 1727096304.07810: done checking for max_fail_percentage 18911 1727096304.07811: checking to see if all hosts have failed and the running result is not ok 18911 1727096304.07811: done checking to see if all hosts have failed 18911 1727096304.07812: getting the remaining hosts for this loop 18911 1727096304.07813: done getting the remaining hosts for this loop 18911 1727096304.07817: getting the next task for host managed_node1 18911 1727096304.07821: done getting next task for host managed_node1 18911 1727096304.07823: ^ task is: TASK: meta (flush_handlers) 18911 1727096304.07825: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18911 1727096304.07828: getting variables 18911 1727096304.07830: in VariableManager get_vars() 18911 1727096304.07857: Calling all_inventory to load vars for managed_node1 18911 1727096304.07860: Calling groups_inventory to load vars for managed_node1 18911 1727096304.07865: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096304.07877: Calling all_plugins_play to load vars for managed_node1 18911 1727096304.07880: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096304.07884: Calling groups_plugins_play to load vars for managed_node1 18911 1727096304.09284: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096304.10228: done with get_vars() 18911 1727096304.10248: done getting variables 18911 1727096304.10305: in VariableManager get_vars() 18911 1727096304.10314: Calling all_inventory to load vars for managed_node1 18911 1727096304.10316: Calling groups_inventory to load vars for 
managed_node1 18911 1727096304.10317: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096304.10321: Calling all_plugins_play to load vars for managed_node1 18911 1727096304.10322: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096304.10324: Calling groups_plugins_play to load vars for managed_node1 18911 1727096304.11451: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096304.12948: done with get_vars() 18911 1727096304.12987: done queuing things up, now waiting for results queue to drain 18911 1727096304.12989: results queue empty 18911 1727096304.12990: checking for any_errors_fatal 18911 1727096304.12995: done checking for any_errors_fatal 18911 1727096304.12996: checking for max_fail_percentage 18911 1727096304.13001: done checking for max_fail_percentage 18911 1727096304.13002: checking to see if all hosts have failed and the running result is not ok 18911 1727096304.13003: done checking to see if all hosts have failed 18911 1727096304.13003: getting the remaining hosts for this loop 18911 1727096304.13005: done getting the remaining hosts for this loop 18911 1727096304.13007: getting the next task for host managed_node1 18911 1727096304.13011: done getting next task for host managed_node1 18911 1727096304.13015: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 18911 1727096304.13016: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096304.13026: getting variables 18911 1727096304.13027: in VariableManager get_vars() 18911 1727096304.13042: Calling all_inventory to load vars for managed_node1 18911 1727096304.13045: Calling groups_inventory to load vars for managed_node1 18911 1727096304.13047: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096304.13052: Calling all_plugins_play to load vars for managed_node1 18911 1727096304.13055: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096304.13057: Calling groups_plugins_play to load vars for managed_node1 18911 1727096304.14260: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096304.15828: done with get_vars() 18911 1727096304.15851: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Monday 23 September 2024 08:58:24 -0400 (0:00:01.041) 0:00:23.273 ****** 18911 1727096304.15931: entering _queue_task() for managed_node1/include_tasks 18911 1727096304.16349: worker is 1 (out of 1 available) 18911 1727096304.16363: exiting _queue_task() for managed_node1/include_tasks 18911 1727096304.16377: done queuing things up, now waiting for results queue to drain 18911 1727096304.16378: waiting for pending results... 
18911 1727096304.16603: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 18911 1727096304.16683: in run() - task 0afff68d-5257-09a7-aae1-00000000003c 18911 1727096304.16702: variable 'ansible_search_path' from source: unknown 18911 1727096304.16706: variable 'ansible_search_path' from source: unknown 18911 1727096304.16732: calling self._execute() 18911 1727096304.16807: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096304.16812: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096304.16821: variable 'omit' from source: magic vars 18911 1727096304.17101: variable 'ansible_distribution_major_version' from source: facts 18911 1727096304.17110: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096304.17116: _execute() done 18911 1727096304.17120: dumping result to json 18911 1727096304.17123: done dumping result, returning 18911 1727096304.17133: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0afff68d-5257-09a7-aae1-00000000003c] 18911 1727096304.17135: sending task result for task 0afff68d-5257-09a7-aae1-00000000003c 18911 1727096304.17225: done sending task result for task 0afff68d-5257-09a7-aae1-00000000003c 18911 1727096304.17228: WORKER PROCESS EXITING 18911 1727096304.17299: no more pending results, returning what we have 18911 1727096304.17303: in VariableManager get_vars() 18911 1727096304.17349: Calling all_inventory to load vars for managed_node1 18911 1727096304.17353: Calling groups_inventory to load vars for managed_node1 18911 1727096304.17355: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096304.17367: Calling all_plugins_play to load vars for managed_node1 18911 1727096304.17370: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096304.17373: Calling 
groups_plugins_play to load vars for managed_node1 18911 1727096304.18181: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096304.19381: done with get_vars() 18911 1727096304.19402: variable 'ansible_search_path' from source: unknown 18911 1727096304.19403: variable 'ansible_search_path' from source: unknown 18911 1727096304.19432: we have included files to process 18911 1727096304.19433: generating all_blocks data 18911 1727096304.19434: done generating all_blocks data 18911 1727096304.19435: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 18911 1727096304.19436: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 18911 1727096304.19439: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 18911 1727096304.19966: done processing included file 18911 1727096304.19969: iterating over new_blocks loaded from include file 18911 1727096304.19970: in VariableManager get_vars() 18911 1727096304.19984: done with get_vars() 18911 1727096304.19985: filtering new block on tags 18911 1727096304.20002: done filtering new block on tags 18911 1727096304.20005: in VariableManager get_vars() 18911 1727096304.20017: done with get_vars() 18911 1727096304.20018: filtering new block on tags 18911 1727096304.20033: done filtering new block on tags 18911 1727096304.20035: in VariableManager get_vars() 18911 1727096304.20047: done with get_vars() 18911 1727096304.20048: filtering new block on tags 18911 1727096304.20057: done filtering new block on tags 18911 1727096304.20058: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 18911 1727096304.20064: extending task lists for 
all hosts with included blocks 18911 1727096304.20279: done extending task lists 18911 1727096304.20280: done processing included files 18911 1727096304.20281: results queue empty 18911 1727096304.20281: checking for any_errors_fatal 18911 1727096304.20282: done checking for any_errors_fatal 18911 1727096304.20282: checking for max_fail_percentage 18911 1727096304.20283: done checking for max_fail_percentage 18911 1727096304.20283: checking to see if all hosts have failed and the running result is not ok 18911 1727096304.20284: done checking to see if all hosts have failed 18911 1727096304.20285: getting the remaining hosts for this loop 18911 1727096304.20285: done getting the remaining hosts for this loop 18911 1727096304.20287: getting the next task for host managed_node1 18911 1727096304.20289: done getting next task for host managed_node1 18911 1727096304.20291: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 18911 1727096304.20292: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096304.20298: getting variables 18911 1727096304.20299: in VariableManager get_vars() 18911 1727096304.20308: Calling all_inventory to load vars for managed_node1 18911 1727096304.20310: Calling groups_inventory to load vars for managed_node1 18911 1727096304.20311: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096304.20315: Calling all_plugins_play to load vars for managed_node1 18911 1727096304.20316: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096304.20318: Calling groups_plugins_play to load vars for managed_node1 18911 1727096304.24493: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096304.26094: done with get_vars() 18911 1727096304.26122: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Monday 23 September 2024 08:58:24 -0400 (0:00:00.102) 0:00:23.376 ****** 18911 1727096304.26203: entering _queue_task() for managed_node1/setup 18911 1727096304.26569: worker is 1 (out of 1 available) 18911 1727096304.26582: exiting _queue_task() for managed_node1/setup 18911 1727096304.26593: done queuing things up, now waiting for results queue to drain 18911 1727096304.26594: waiting for pending results... 
18911 1727096304.26990: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 18911 1727096304.27045: in run() - task 0afff68d-5257-09a7-aae1-00000000036c 18911 1727096304.27066: variable 'ansible_search_path' from source: unknown 18911 1727096304.27076: variable 'ansible_search_path' from source: unknown 18911 1727096304.27119: calling self._execute() 18911 1727096304.27223: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096304.27235: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096304.27251: variable 'omit' from source: magic vars 18911 1727096304.27638: variable 'ansible_distribution_major_version' from source: facts 18911 1727096304.27654: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096304.27879: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18911 1727096304.30110: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18911 1727096304.30197: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18911 1727096304.30244: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18911 1727096304.30289: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18911 1727096304.30320: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18911 1727096304.30411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096304.30466: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096304.30486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096304.30573: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096304.30576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096304.30611: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096304.30641: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096304.30678: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096304.30724: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096304.30744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096304.31012: variable '__network_required_facts' from source: role 
'' defaults 18911 1727096304.31015: variable 'ansible_facts' from source: unknown 18911 1727096304.31740: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 18911 1727096304.31750: when evaluation is False, skipping this task 18911 1727096304.31757: _execute() done 18911 1727096304.31770: dumping result to json 18911 1727096304.31780: done dumping result, returning 18911 1727096304.31793: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0afff68d-5257-09a7-aae1-00000000036c] 18911 1727096304.31803: sending task result for task 0afff68d-5257-09a7-aae1-00000000036c skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18911 1727096304.32041: no more pending results, returning what we have 18911 1727096304.32045: results queue empty 18911 1727096304.32046: checking for any_errors_fatal 18911 1727096304.32048: done checking for any_errors_fatal 18911 1727096304.32049: checking for max_fail_percentage 18911 1727096304.32050: done checking for max_fail_percentage 18911 1727096304.32051: checking to see if all hosts have failed and the running result is not ok 18911 1727096304.32052: done checking to see if all hosts have failed 18911 1727096304.32053: getting the remaining hosts for this loop 18911 1727096304.32054: done getting the remaining hosts for this loop 18911 1727096304.32058: getting the next task for host managed_node1 18911 1727096304.32072: done getting next task for host managed_node1 18911 1727096304.32077: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 18911 1727096304.32079: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18911 1727096304.32093: getting variables 18911 1727096304.32095: in VariableManager get_vars() 18911 1727096304.32135: Calling all_inventory to load vars for managed_node1 18911 1727096304.32138: Calling groups_inventory to load vars for managed_node1 18911 1727096304.32141: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096304.32152: Calling all_plugins_play to load vars for managed_node1 18911 1727096304.32155: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096304.32158: Calling groups_plugins_play to load vars for managed_node1 18911 1727096304.32172: done sending task result for task 0afff68d-5257-09a7-aae1-00000000036c 18911 1727096304.32176: WORKER PROCESS EXITING 18911 1727096304.33805: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096304.35439: done with get_vars() 18911 1727096304.35471: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Monday 23 September 2024 08:58:24 -0400 (0:00:00.093) 0:00:23.470 ****** 18911 1727096304.35572: entering _queue_task() for managed_node1/stat 18911 1727096304.35934: worker is 1 (out of 1 available) 18911 1727096304.35949: exiting _queue_task() for managed_node1/stat 18911 1727096304.35960: done queuing things up, now waiting for results queue to drain 18911 1727096304.35961: waiting for pending results... 
18911 1727096304.36252: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 18911 1727096304.36417: in run() - task 0afff68d-5257-09a7-aae1-00000000036e 18911 1727096304.36438: variable 'ansible_search_path' from source: unknown 18911 1727096304.36445: variable 'ansible_search_path' from source: unknown 18911 1727096304.36497: calling self._execute() 18911 1727096304.36599: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096304.36616: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096304.36632: variable 'omit' from source: magic vars 18911 1727096304.37026: variable 'ansible_distribution_major_version' from source: facts 18911 1727096304.37048: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096304.37220: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18911 1727096304.37433: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18911 1727096304.37473: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18911 1727096304.37528: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18911 1727096304.37562: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18911 1727096304.37625: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18911 1727096304.37644: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18911 1727096304.37664: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096304.37687: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18911 1727096304.37755: variable '__network_is_ostree' from source: set_fact 18911 1727096304.37759: Evaluated conditional (not __network_is_ostree is defined): False 18911 1727096304.37763: when evaluation is False, skipping this task 18911 1727096304.37770: _execute() done 18911 1727096304.37773: dumping result to json 18911 1727096304.37777: done dumping result, returning 18911 1727096304.37786: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [0afff68d-5257-09a7-aae1-00000000036e] 18911 1727096304.37788: sending task result for task 0afff68d-5257-09a7-aae1-00000000036e 18911 1727096304.37875: done sending task result for task 0afff68d-5257-09a7-aae1-00000000036e 18911 1727096304.37877: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 18911 1727096304.37957: no more pending results, returning what we have 18911 1727096304.37960: results queue empty 18911 1727096304.37961: checking for any_errors_fatal 18911 1727096304.37968: done checking for any_errors_fatal 18911 1727096304.37969: checking for max_fail_percentage 18911 1727096304.37971: done checking for max_fail_percentage 18911 1727096304.37972: checking to see if all hosts have failed and the running result is not ok 18911 1727096304.37973: done checking to see if all hosts have failed 18911 1727096304.37974: getting the remaining hosts for this loop 18911 1727096304.37975: done getting the remaining hosts for this loop 18911 
1727096304.37982: getting the next task for host managed_node1 18911 1727096304.37988: done getting next task for host managed_node1 18911 1727096304.37992: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 18911 1727096304.37996: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18911 1727096304.38009: getting variables 18911 1727096304.38010: in VariableManager get_vars() 18911 1727096304.38043: Calling all_inventory to load vars for managed_node1 18911 1727096304.38046: Calling groups_inventory to load vars for managed_node1 18911 1727096304.38047: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096304.38057: Calling all_plugins_play to load vars for managed_node1 18911 1727096304.38059: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096304.38061: Calling groups_plugins_play to load vars for managed_node1 18911 1727096304.39003: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096304.40201: done with get_vars() 18911 1727096304.40220: done getting variables 18911 1727096304.40283: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Monday 23 September 2024 08:58:24 -0400 (0:00:00.047) 0:00:23.517 ****** 18911 1727096304.40315: entering _queue_task() for managed_node1/set_fact 18911 1727096304.40654: worker is 1 (out of 1 available) 18911 1727096304.40666: exiting _queue_task() for managed_node1/set_fact 18911 1727096304.40880: done queuing things up, now waiting for results queue to drain 18911 1727096304.40881: waiting for pending results... 18911 1727096304.41008: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 18911 1727096304.41102: in run() - task 0afff68d-5257-09a7-aae1-00000000036f 18911 1727096304.41123: variable 'ansible_search_path' from source: unknown 18911 1727096304.41130: variable 'ansible_search_path' from source: unknown 18911 1727096304.41171: calling self._execute() 18911 1727096304.41279: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096304.41327: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096304.41330: variable 'omit' from source: magic vars 18911 1727096304.41670: variable 'ansible_distribution_major_version' from source: facts 18911 1727096304.41678: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096304.41806: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18911 1727096304.42002: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18911 1727096304.42035: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18911 1727096304.42064: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18911 
1727096304.42122: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18911 1727096304.42187: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18911 1727096304.42207: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18911 1727096304.42225: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096304.42243: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18911 1727096304.42318: variable '__network_is_ostree' from source: set_fact 18911 1727096304.42325: Evaluated conditional (not __network_is_ostree is defined): False 18911 1727096304.42328: when evaluation is False, skipping this task 18911 1727096304.42330: _execute() done 18911 1727096304.42333: dumping result to json 18911 1727096304.42335: done dumping result, returning 18911 1727096304.42343: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0afff68d-5257-09a7-aae1-00000000036f] 18911 1727096304.42346: sending task result for task 0afff68d-5257-09a7-aae1-00000000036f 18911 1727096304.42432: done sending task result for task 0afff68d-5257-09a7-aae1-00000000036f 18911 1727096304.42435: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 18911 1727096304.42492: no more pending results, returning what we 
have 18911 1727096304.42495: results queue empty 18911 1727096304.42496: checking for any_errors_fatal 18911 1727096304.42504: done checking for any_errors_fatal 18911 1727096304.42504: checking for max_fail_percentage 18911 1727096304.42506: done checking for max_fail_percentage 18911 1727096304.42507: checking to see if all hosts have failed and the running result is not ok 18911 1727096304.42508: done checking to see if all hosts have failed 18911 1727096304.42508: getting the remaining hosts for this loop 18911 1727096304.42510: done getting the remaining hosts for this loop 18911 1727096304.42513: getting the next task for host managed_node1 18911 1727096304.42522: done getting next task for host managed_node1 18911 1727096304.42526: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 18911 1727096304.42528: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096304.42541: getting variables 18911 1727096304.42543: in VariableManager get_vars() 18911 1727096304.42585: Calling all_inventory to load vars for managed_node1 18911 1727096304.42588: Calling groups_inventory to load vars for managed_node1 18911 1727096304.42591: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096304.42600: Calling all_plugins_play to load vars for managed_node1 18911 1727096304.42603: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096304.42605: Calling groups_plugins_play to load vars for managed_node1 18911 1727096304.43429: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096304.44649: done with get_vars() 18911 1727096304.44673: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Monday 23 September 2024 08:58:24 -0400 (0:00:00.044) 0:00:23.562 ****** 18911 1727096304.44764: entering _queue_task() for managed_node1/service_facts 18911 1727096304.45110: worker is 1 (out of 1 available) 18911 1727096304.45123: exiting _queue_task() for managed_node1/service_facts 18911 1727096304.45141: done queuing things up, now waiting for results queue to drain 18911 1727096304.45142: waiting for pending results... 
18911 1727096304.45388: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 18911 1727096304.45478: in run() - task 0afff68d-5257-09a7-aae1-000000000371 18911 1727096304.45486: variable 'ansible_search_path' from source: unknown 18911 1727096304.45489: variable 'ansible_search_path' from source: unknown 18911 1727096304.45513: calling self._execute() 18911 1727096304.45596: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096304.45600: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096304.45603: variable 'omit' from source: magic vars 18911 1727096304.45881: variable 'ansible_distribution_major_version' from source: facts 18911 1727096304.45892: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096304.45897: variable 'omit' from source: magic vars 18911 1727096304.45936: variable 'omit' from source: magic vars 18911 1727096304.45964: variable 'omit' from source: magic vars 18911 1727096304.45997: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18911 1727096304.46025: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18911 1727096304.46043: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18911 1727096304.46058: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096304.46069: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096304.46093: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18911 1727096304.46096: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096304.46101: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 18911 1727096304.46175: Set connection var ansible_shell_executable to /bin/sh 18911 1727096304.46179: Set connection var ansible_timeout to 10 18911 1727096304.46181: Set connection var ansible_shell_type to sh 18911 1727096304.46188: Set connection var ansible_module_compression to ZIP_DEFLATED 18911 1727096304.46193: Set connection var ansible_pipelining to False 18911 1727096304.46198: Set connection var ansible_connection to ssh 18911 1727096304.46216: variable 'ansible_shell_executable' from source: unknown 18911 1727096304.46219: variable 'ansible_connection' from source: unknown 18911 1727096304.46222: variable 'ansible_module_compression' from source: unknown 18911 1727096304.46224: variable 'ansible_shell_type' from source: unknown 18911 1727096304.46228: variable 'ansible_shell_executable' from source: unknown 18911 1727096304.46231: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096304.46234: variable 'ansible_pipelining' from source: unknown 18911 1727096304.46236: variable 'ansible_timeout' from source: unknown 18911 1727096304.46239: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096304.46385: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18911 1727096304.46393: variable 'omit' from source: magic vars 18911 1727096304.46398: starting attempt loop 18911 1727096304.46401: running the handler 18911 1727096304.46412: _low_level_execute_command(): starting 18911 1727096304.46419: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18911 1727096304.46951: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 18911 1727096304.46955: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096304.46959: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096304.46961: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096304.47021: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096304.47026: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096304.47029: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096304.47100: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096304.48837: stdout chunk (state=3): >>>/root <<< 18911 1727096304.48931: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096304.48966: stderr chunk (state=3): >>><<< 18911 1727096304.48971: stdout chunk (state=3): >>><<< 18911 1727096304.48992: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096304.49006: _low_level_execute_command(): starting 18911 1727096304.49011: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096304.489918-20053-30509498884131 `" && echo ansible-tmp-1727096304.489918-20053-30509498884131="` echo /root/.ansible/tmp/ansible-tmp-1727096304.489918-20053-30509498884131 `" ) && sleep 0' 18911 1727096304.49566: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096304.49571: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096304.49580: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096304.49583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096304.49630: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096304.49636: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096304.49641: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096304.49721: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096304.51742: stdout chunk (state=3): >>>ansible-tmp-1727096304.489918-20053-30509498884131=/root/.ansible/tmp/ansible-tmp-1727096304.489918-20053-30509498884131 <<< 18911 1727096304.51848: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096304.51883: stderr chunk (state=3): >>><<< 18911 1727096304.51886: stdout chunk (state=3): >>><<< 18911 1727096304.51902: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096304.489918-20053-30509498884131=/root/.ansible/tmp/ansible-tmp-1727096304.489918-20053-30509498884131 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096304.51945: variable 'ansible_module_compression' from source: unknown 18911 1727096304.52173: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18911d7od04qi/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 18911 1727096304.52177: variable 'ansible_facts' from source: unknown 18911 1727096304.52186: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096304.489918-20053-30509498884131/AnsiballZ_service_facts.py 18911 1727096304.52447: Sending initial data 18911 1727096304.52450: Sent initial data (160 bytes) 18911 1727096304.53293: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096304.53298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096304.53325: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096304.53383: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096304.53400: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096304.53476: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096304.55222: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18911 1727096304.55302: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18911 1727096304.55390: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18911d7od04qi/tmpgcf14l4p /root/.ansible/tmp/ansible-tmp-1727096304.489918-20053-30509498884131/AnsiballZ_service_facts.py <<< 18911 1727096304.55394: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096304.489918-20053-30509498884131/AnsiballZ_service_facts.py" <<< 18911 1727096304.55475: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18911d7od04qi/tmpgcf14l4p" to remote "/root/.ansible/tmp/ansible-tmp-1727096304.489918-20053-30509498884131/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096304.489918-20053-30509498884131/AnsiballZ_service_facts.py" <<< 18911 1727096304.56611: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096304.56665: stderr chunk (state=3): >>><<< 18911 1727096304.56670: stdout chunk (state=3): >>><<< 18911 1727096304.56672: done transferring module to remote 18911 1727096304.56687: _low_level_execute_command(): starting 18911 1727096304.56700: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096304.489918-20053-30509498884131/ /root/.ansible/tmp/ansible-tmp-1727096304.489918-20053-30509498884131/AnsiballZ_service_facts.py && sleep 0' 18911 1727096304.57412: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096304.57496: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096304.57550: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096304.57627: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096304.59648: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096304.59652: stdout chunk (state=3): >>><<< 18911 1727096304.59655: stderr chunk (state=3): >>><<< 18911 1727096304.59672: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096304.59684: _low_level_execute_command(): starting 18911 1727096304.59764: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096304.489918-20053-30509498884131/AnsiballZ_service_facts.py && sleep 0' 18911 1727096304.60330: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096304.60348: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096304.60364: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096304.60385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096304.60403: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 18911 1727096304.60456: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096304.60519: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096304.60543: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 
1727096304.60659: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096304.60701: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096306.23458: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": 
"dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": 
"initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": 
"NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": 
"rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", 
"source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": 
"systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", 
"state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": 
"nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": 
"inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": 
"systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": 
{"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 18911 1727096306.25257: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. <<< 18911 1727096306.25261: stdout chunk (state=3): >>><<< 18911 1727096306.25263: stderr chunk (state=3): >>><<< 18911 1727096306.25267: _low_level_execute_command() done: rc=0, stdout=
{"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": 
"debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": 
"systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
18911 1727096306.26020: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096304.489918-20053-30509498884131/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18911 1727096306.26025: _low_level_execute_command(): starting 18911 1727096306.26029: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096304.489918-20053-30509498884131/ > /dev/null 2>&1 && sleep 0' 18911 1727096306.26835: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096306.26987: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096306.27096: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096306.29058: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096306.29281: stderr chunk (state=3): >>><<< 18911 1727096306.29285: stdout chunk (state=3): >>><<< 18911 1727096306.29288: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096306.29291: handler run complete 18911 1727096306.29571: variable 'ansible_facts' from source: unknown 18911 1727096306.29974: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 
1727096306.30758: variable 'ansible_facts' from source: unknown 18911 1727096306.31179: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096306.31528: attempt loop complete, returning result 18911 1727096306.31879: _execute() done 18911 1727096306.31882: dumping result to json 18911 1727096306.31885: done dumping result, returning 18911 1727096306.31887: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [0afff68d-5257-09a7-aae1-000000000371] 18911 1727096306.31889: sending task result for task 0afff68d-5257-09a7-aae1-000000000371 ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18911 1727096306.32987: no more pending results, returning what we have 18911 1727096306.32990: results queue empty 18911 1727096306.32991: checking for any_errors_fatal 18911 1727096306.32996: done checking for any_errors_fatal 18911 1727096306.32997: checking for max_fail_percentage 18911 1727096306.32999: done checking for max_fail_percentage 18911 1727096306.32999: checking to see if all hosts have failed and the running result is not ok 18911 1727096306.33000: done checking to see if all hosts have failed 18911 1727096306.33001: getting the remaining hosts for this loop 18911 1727096306.33002: done getting the remaining hosts for this loop 18911 1727096306.33006: getting the next task for host managed_node1 18911 1727096306.33014: done getting next task for host managed_node1 18911 1727096306.33017: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 18911 1727096306.33020: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18911 1727096306.33031: getting variables 18911 1727096306.33032: in VariableManager get_vars() 18911 1727096306.33170: Calling all_inventory to load vars for managed_node1 18911 1727096306.33177: Calling groups_inventory to load vars for managed_node1 18911 1727096306.33180: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096306.33192: Calling all_plugins_play to load vars for managed_node1 18911 1727096306.33195: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096306.33198: Calling groups_plugins_play to load vars for managed_node1 18911 1727096306.34303: done sending task result for task 0afff68d-5257-09a7-aae1-000000000371 18911 1727096306.34308: WORKER PROCESS EXITING 18911 1727096306.36354: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096306.40056: done with get_vars() 18911 1727096306.40289: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Monday 23 September 2024 08:58:26 -0400 (0:00:01.957) 0:00:25.519 ****** 18911 1727096306.40525: entering _queue_task() for managed_node1/package_facts 18911 1727096306.41452: worker is 1 (out of 1 available) 18911 1727096306.41467: exiting _queue_task() for managed_node1/package_facts 18911 1727096306.41482: done queuing things up, now waiting for results queue to drain 18911 1727096306.41484: waiting for pending results... 
18911 1727096306.41932: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 18911 1727096306.42303: in run() - task 0afff68d-5257-09a7-aae1-000000000372 18911 1727096306.42474: variable 'ansible_search_path' from source: unknown 18911 1727096306.42484: variable 'ansible_search_path' from source: unknown 18911 1727096306.42487: calling self._execute() 18911 1727096306.42491: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096306.42493: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096306.42496: variable 'omit' from source: magic vars 18911 1727096306.43189: variable 'ansible_distribution_major_version' from source: facts 18911 1727096306.43208: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096306.43220: variable 'omit' from source: magic vars 18911 1727096306.43282: variable 'omit' from source: magic vars 18911 1727096306.43329: variable 'omit' from source: magic vars 18911 1727096306.43372: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18911 1727096306.43420: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18911 1727096306.43451: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18911 1727096306.43478: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096306.43495: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096306.43532: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18911 1727096306.43539: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096306.43546: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 18911 1727096306.43657: Set connection var ansible_shell_executable to /bin/sh 18911 1727096306.43676: Set connection var ansible_timeout to 10 18911 1727096306.43684: Set connection var ansible_shell_type to sh 18911 1727096306.43697: Set connection var ansible_module_compression to ZIP_DEFLATED 18911 1727096306.43706: Set connection var ansible_pipelining to False 18911 1727096306.43715: Set connection var ansible_connection to ssh 18911 1727096306.43742: variable 'ansible_shell_executable' from source: unknown 18911 1727096306.43750: variable 'ansible_connection' from source: unknown 18911 1727096306.43758: variable 'ansible_module_compression' from source: unknown 18911 1727096306.43764: variable 'ansible_shell_type' from source: unknown 18911 1727096306.43777: variable 'ansible_shell_executable' from source: unknown 18911 1727096306.43784: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096306.43792: variable 'ansible_pipelining' from source: unknown 18911 1727096306.43798: variable 'ansible_timeout' from source: unknown 18911 1727096306.43805: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096306.44330: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18911 1727096306.44348: variable 'omit' from source: magic vars 18911 1727096306.44358: starting attempt loop 18911 1727096306.44366: running the handler 18911 1727096306.44394: _low_level_execute_command(): starting 18911 1727096306.44574: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18911 1727096306.45689: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096306.45723: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 18911 1727096306.45741: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096306.45819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096306.45871: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096306.45898: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096306.45932: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096306.46087: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096306.47818: stdout chunk (state=3): >>>/root <<< 18911 1727096306.48275: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096306.48279: stdout chunk (state=3): >>><<< 18911 1727096306.48281: stderr chunk (state=3): >>><<< 18911 1727096306.48285: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096306.48288: _low_level_execute_command(): starting 18911 1727096306.48291: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096306.482081-20164-223739416044087 `" && echo ansible-tmp-1727096306.482081-20164-223739416044087="` echo /root/.ansible/tmp/ansible-tmp-1727096306.482081-20164-223739416044087 `" ) && sleep 0' 18911 1727096306.49975: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096306.50273: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096306.50277: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096306.50279: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096306.52326: stdout chunk (state=3): >>>ansible-tmp-1727096306.482081-20164-223739416044087=/root/.ansible/tmp/ansible-tmp-1727096306.482081-20164-223739416044087 <<< 18911 1727096306.52464: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096306.52478: stdout chunk (state=3): >>><<< 18911 1727096306.52485: stderr chunk (state=3): >>><<< 18911 1727096306.52504: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096306.482081-20164-223739416044087=/root/.ansible/tmp/ansible-tmp-1727096306.482081-20164-223739416044087 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096306.52614: variable 'ansible_module_compression' from source: unknown 18911 1727096306.52664: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18911d7od04qi/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 18911 1727096306.52952: variable 'ansible_facts' from source: unknown 18911 1727096306.53229: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096306.482081-20164-223739416044087/AnsiballZ_package_facts.py 18911 1727096306.53679: Sending initial data 18911 1727096306.53682: Sent initial data (161 bytes) 18911 1727096306.54785: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096306.54794: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096306.54806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096306.54827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096306.54840: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 18911 1727096306.54848: stderr chunk (state=3): >>>debug2: match not found <<< 18911 1727096306.54981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096306.55054: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096306.55150: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096306.55246: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096306.57013: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18911 1727096306.57076: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18911 1727096306.57497: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18911d7od04qi/tmpyzgiflfn /root/.ansible/tmp/ansible-tmp-1727096306.482081-20164-223739416044087/AnsiballZ_package_facts.py <<< 18911 1727096306.57500: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096306.482081-20164-223739416044087/AnsiballZ_package_facts.py" <<< 18911 1727096306.57557: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18911d7od04qi/tmpyzgiflfn" to remote "/root/.ansible/tmp/ansible-tmp-1727096306.482081-20164-223739416044087/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096306.482081-20164-223739416044087/AnsiballZ_package_facts.py" <<< 18911 1727096306.61095: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096306.61218: stderr chunk (state=3): >>><<< 18911 1727096306.61221: stdout chunk (state=3): >>><<< 18911 1727096306.61223: done transferring module to remote 18911 1727096306.61226: _low_level_execute_command(): starting 18911 1727096306.61228: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096306.482081-20164-223739416044087/ /root/.ansible/tmp/ansible-tmp-1727096306.482081-20164-223739416044087/AnsiballZ_package_facts.py && sleep 0' 18911 1727096306.62011: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096306.62023: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096306.62037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096306.62053: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096306.62087: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 
originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096306.62183: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096306.62208: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096306.62236: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096306.62340: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096306.64307: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096306.64311: stdout chunk (state=3): >>><<< 18911 1727096306.64313: stderr chunk (state=3): >>><<< 18911 1727096306.64464: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096306.64470: _low_level_execute_command(): starting 18911 1727096306.64473: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096306.482081-20164-223739416044087/AnsiballZ_package_facts.py && sleep 0' 18911 1727096306.65329: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096306.65453: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096306.65457: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096306.65671: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096306.65687: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096306.65796: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096307.10255: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", 
"version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": 
"4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name"<<< 18911 1727096307.10342: stdout chunk (state=3): >>>: "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", 
"version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", 
"release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": 
"cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", 
"version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink",
"version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": 
"cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": 
"logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64",
"source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", 
"release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", 
"version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", 
"version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version":
"6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": 
"perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct":
[{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": 
"510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": 
[{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": 
"rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": 
[{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": 
[{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], 
"device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 18911 1727096307.12445: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
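The `package_facts` payload above maps each package name to a list of dicts with `name`, `version`, `release`, `epoch` (often `null`), `arch`, and `source` keys. A minimal sketch of consuming that structure outside Ansible; the `facts_json` excerpt mirrors entries from the dump above, and the `nevra` helper is a hypothetical name, not part of any Ansible API:

```python
import json

# Hypothetical excerpt mirroring the ansible_facts.packages structure
# in the log above (git and tar entries copied from the dump).
facts_json = '''
{"ansible_facts": {"packages": {
  "git": [{"name": "git", "version": "2.45.2", "release": "3.el10",
           "epoch": null, "arch": "x86_64", "source": "rpm"}],
  "tar": [{"name": "tar", "version": "1.35", "release": "4.el10",
           "epoch": 2, "arch": "x86_64", "source": "rpm"}]
}}}
'''

def nevra(pkg: dict) -> str:
    """Format one package entry as NEVRA; epoch prefix omitted when null/0."""
    epoch = pkg.get("epoch")
    ev = f"{epoch}:" if epoch else ""
    return f'{pkg["name"]}-{ev}{pkg["version"]}-{pkg["release"]}.{pkg["arch"]}'

packages = json.loads(facts_json)["ansible_facts"]["packages"]
print(nevra(packages["git"][0]))   # git-2.45.2-3.el10.x86_64
print(nevra(packages["tar"][0]))   # tar-2:1.35-4.el10.x86_64
```

Each value is a list because multiple versions of one package (e.g. multilib or kernel variants) can be installed simultaneously; code reading this fact should iterate the list rather than assume a single entry.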
<<< 18911 1727096307.12448: stdout chunk (state=3): >>><<< 18911 1727096307.12450: stderr chunk (state=3): >>><<< 18911 1727096307.12524: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
18911 1727096307.14946: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096306.482081-20164-223739416044087/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18911 1727096307.14950: _low_level_execute_command(): starting 18911 1727096307.14952: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096306.482081-20164-223739416044087/ > /dev/null 2>&1 && sleep 0' 18911 1727096307.15556: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096307.15640: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096307.15644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096307.15695: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096307.15713: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096307.15743: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096307.15856: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096307.18184: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096307.18188: stdout chunk (state=3): >>><<< 18911 1727096307.18191: stderr chunk (state=3): >>><<< 18911 1727096307.18193: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096307.18196: 
handler run complete 18911 1727096307.19698: variable 'ansible_facts' from source: unknown 18911 1727096307.20177: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096307.23976: variable 'ansible_facts' from source: unknown 18911 1727096307.24539: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096307.26006: attempt loop complete, returning result 18911 1727096307.26029: _execute() done 18911 1727096307.26038: dumping result to json 18911 1727096307.26421: done dumping result, returning 18911 1727096307.26653: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [0afff68d-5257-09a7-aae1-000000000372] 18911 1727096307.26656: sending task result for task 0afff68d-5257-09a7-aae1-000000000372 18911 1727096307.30156: done sending task result for task 0afff68d-5257-09a7-aae1-000000000372 18911 1727096307.30160: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18911 1727096307.30324: no more pending results, returning what we have 18911 1727096307.30327: results queue empty 18911 1727096307.30328: checking for any_errors_fatal 18911 1727096307.30334: done checking for any_errors_fatal 18911 1727096307.30334: checking for max_fail_percentage 18911 1727096307.30336: done checking for max_fail_percentage 18911 1727096307.30337: checking to see if all hosts have failed and the running result is not ok 18911 1727096307.30338: done checking to see if all hosts have failed 18911 1727096307.30339: getting the remaining hosts for this loop 18911 1727096307.30340: done getting the remaining hosts for this loop 18911 1727096307.30343: getting the next task for host managed_node1 18911 1727096307.30350: done getting next task for host managed_node1 18911 
1727096307.30354: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 18911 1727096307.30356: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18911 1727096307.30365: getting variables 18911 1727096307.30366: in VariableManager get_vars() 18911 1727096307.30402: Calling all_inventory to load vars for managed_node1 18911 1727096307.30405: Calling groups_inventory to load vars for managed_node1 18911 1727096307.30407: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096307.30416: Calling all_plugins_play to load vars for managed_node1 18911 1727096307.30419: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096307.30422: Calling groups_plugins_play to load vars for managed_node1 18911 1727096307.32248: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096307.35753: done with get_vars() 18911 1727096307.35989: done getting variables 18911 1727096307.36051: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Monday 23 September 2024 08:58:27 -0400 (0:00:00.955) 0:00:26.475 ****** 18911 1727096307.36085: entering _queue_task() for managed_node1/debug 18911 1727096307.36844: worker is 1 (out of 1 available) 18911 1727096307.36858: exiting _queue_task() for 
managed_node1/debug 18911 1727096307.36871: done queuing things up, now waiting for results queue to drain 18911 1727096307.36873: waiting for pending results... 18911 1727096307.37654: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 18911 1727096307.37976: in run() - task 0afff68d-5257-09a7-aae1-00000000003d 18911 1727096307.37999: variable 'ansible_search_path' from source: unknown 18911 1727096307.38273: variable 'ansible_search_path' from source: unknown 18911 1727096307.38277: calling self._execute() 18911 1727096307.38280: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096307.38284: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096307.38286: variable 'omit' from source: magic vars 18911 1727096307.39036: variable 'ansible_distribution_major_version' from source: facts 18911 1727096307.39472: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096307.39476: variable 'omit' from source: magic vars 18911 1727096307.39479: variable 'omit' from source: magic vars 18911 1727096307.39481: variable 'network_provider' from source: set_fact 18911 1727096307.39484: variable 'omit' from source: magic vars 18911 1727096307.39972: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18911 1727096307.39976: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18911 1727096307.39980: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18911 1727096307.39982: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096307.39985: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096307.39987: variable 
'inventory_hostname' from source: host vars for 'managed_node1' 18911 1727096307.39989: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096307.39991: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096307.40176: Set connection var ansible_shell_executable to /bin/sh 18911 1727096307.40236: Set connection var ansible_timeout to 10 18911 1727096307.40246: Set connection var ansible_shell_type to sh 18911 1727096307.40283: Set connection var ansible_module_compression to ZIP_DEFLATED 18911 1727096307.40294: Set connection var ansible_pipelining to False 18911 1727096307.40304: Set connection var ansible_connection to ssh 18911 1727096307.40359: variable 'ansible_shell_executable' from source: unknown 18911 1727096307.40481: variable 'ansible_connection' from source: unknown 18911 1727096307.40490: variable 'ansible_module_compression' from source: unknown 18911 1727096307.40499: variable 'ansible_shell_type' from source: unknown 18911 1727096307.40503: variable 'ansible_shell_executable' from source: unknown 18911 1727096307.40506: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096307.40508: variable 'ansible_pipelining' from source: unknown 18911 1727096307.40511: variable 'ansible_timeout' from source: unknown 18911 1727096307.40514: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096307.40791: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18911 1727096307.40812: variable 'omit' from source: magic vars 18911 1727096307.40848: starting attempt loop 18911 1727096307.40857: running the handler 18911 1727096307.41045: handler run complete 18911 1727096307.41071: attempt loop complete, 
returning result 18911 1727096307.41080: _execute() done 18911 1727096307.41089: dumping result to json 18911 1727096307.41096: done dumping result, returning 18911 1727096307.41109: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0afff68d-5257-09a7-aae1-00000000003d] 18911 1727096307.41122: sending task result for task 0afff68d-5257-09a7-aae1-00000000003d ok: [managed_node1] => {} MSG: Using network provider: nm 18911 1727096307.41324: no more pending results, returning what we have 18911 1727096307.41328: results queue empty 18911 1727096307.41329: checking for any_errors_fatal 18911 1727096307.41341: done checking for any_errors_fatal 18911 1727096307.41342: checking for max_fail_percentage 18911 1727096307.41343: done checking for max_fail_percentage 18911 1727096307.41344: checking to see if all hosts have failed and the running result is not ok 18911 1727096307.41345: done checking to see if all hosts have failed 18911 1727096307.41345: getting the remaining hosts for this loop 18911 1727096307.41347: done getting the remaining hosts for this loop 18911 1727096307.41350: getting the next task for host managed_node1 18911 1727096307.41357: done getting next task for host managed_node1 18911 1727096307.41361: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 18911 1727096307.41363: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096307.41375: getting variables 18911 1727096307.41376: in VariableManager get_vars() 18911 1727096307.41413: Calling all_inventory to load vars for managed_node1 18911 1727096307.41415: Calling groups_inventory to load vars for managed_node1 18911 1727096307.41417: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096307.41427: Calling all_plugins_play to load vars for managed_node1 18911 1727096307.41429: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096307.41431: Calling groups_plugins_play to load vars for managed_node1 18911 1727096307.42076: done sending task result for task 0afff68d-5257-09a7-aae1-00000000003d 18911 1727096307.42080: WORKER PROCESS EXITING 18911 1727096307.44460: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096307.46866: done with get_vars() 18911 1727096307.46901: done getting variables 18911 1727096307.46988: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Monday 23 September 2024 08:58:27 -0400 (0:00:00.109) 0:00:26.585 ****** 18911 1727096307.47065: entering _queue_task() for managed_node1/fail 18911 1727096307.47440: worker is 1 (out of 1 available) 18911 1727096307.47453: exiting _queue_task() for managed_node1/fail 18911 1727096307.47465: done queuing things up, now waiting for results queue to drain 18911 1727096307.47466: waiting for pending results... 
18911 1727096307.47751: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 18911 1727096307.47888: in run() - task 0afff68d-5257-09a7-aae1-00000000003e 18911 1727096307.47987: variable 'ansible_search_path' from source: unknown 18911 1727096307.47991: variable 'ansible_search_path' from source: unknown 18911 1727096307.47994: calling self._execute() 18911 1727096307.48078: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096307.48097: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096307.48115: variable 'omit' from source: magic vars 18911 1727096307.48514: variable 'ansible_distribution_major_version' from source: facts 18911 1727096307.48538: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096307.48679: variable 'network_state' from source: role '' defaults 18911 1727096307.48700: Evaluated conditional (network_state != {}): False 18911 1727096307.48707: when evaluation is False, skipping this task 18911 1727096307.48715: _execute() done 18911 1727096307.48723: dumping result to json 18911 1727096307.48729: done dumping result, returning 18911 1727096307.48775: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0afff68d-5257-09a7-aae1-00000000003e] 18911 1727096307.48837: sending task result for task 0afff68d-5257-09a7-aae1-00000000003e skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18911 1727096307.48990: no more pending results, returning what we have 18911 1727096307.48993: results queue empty 18911 1727096307.48994: checking for any_errors_fatal 18911 1727096307.49002: done 
checking for any_errors_fatal 18911 1727096307.49003: checking for max_fail_percentage 18911 1727096307.49004: done checking for max_fail_percentage 18911 1727096307.49009: checking to see if all hosts have failed and the running result is not ok 18911 1727096307.49010: done checking to see if all hosts have failed 18911 1727096307.49011: getting the remaining hosts for this loop 18911 1727096307.49012: done getting the remaining hosts for this loop 18911 1727096307.49016: getting the next task for host managed_node1 18911 1727096307.49022: done getting next task for host managed_node1 18911 1727096307.49026: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 18911 1727096307.49028: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096307.49042: getting variables 18911 1727096307.49044: in VariableManager get_vars() 18911 1727096307.49086: Calling all_inventory to load vars for managed_node1 18911 1727096307.49089: Calling groups_inventory to load vars for managed_node1 18911 1727096307.49091: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096307.49103: Calling all_plugins_play to load vars for managed_node1 18911 1727096307.49106: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096307.49109: Calling groups_plugins_play to load vars for managed_node1 18911 1727096307.49883: done sending task result for task 0afff68d-5257-09a7-aae1-00000000003e 18911 1727096307.49886: WORKER PROCESS EXITING 18911 1727096307.51051: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096307.53125: done with get_vars() 18911 1727096307.53150: done getting variables 18911 1727096307.53221: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Monday 23 September 2024 08:58:27 -0400 (0:00:00.061) 0:00:26.646 ****** 18911 1727096307.53254: entering _queue_task() for managed_node1/fail 18911 1727096307.53614: worker is 1 (out of 1 available) 18911 1727096307.53628: exiting _queue_task() for managed_node1/fail 18911 1727096307.53641: done queuing things up, now waiting for results queue to drain 18911 1727096307.53642: waiting for pending results... 
18911 1727096307.54345: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 18911 1727096307.54440: in run() - task 0afff68d-5257-09a7-aae1-00000000003f 18911 1727096307.54451: variable 'ansible_search_path' from source: unknown 18911 1727096307.54454: variable 'ansible_search_path' from source: unknown 18911 1727096307.54566: calling self._execute() 18911 1727096307.54757: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096307.54761: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096307.54766: variable 'omit' from source: magic vars 18911 1727096307.55170: variable 'ansible_distribution_major_version' from source: facts 18911 1727096307.55195: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096307.55329: variable 'network_state' from source: role '' defaults 18911 1727096307.55345: Evaluated conditional (network_state != {}): False 18911 1727096307.55353: when evaluation is False, skipping this task 18911 1727096307.55360: _execute() done 18911 1727096307.55373: dumping result to json 18911 1727096307.55381: done dumping result, returning 18911 1727096307.55394: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0afff68d-5257-09a7-aae1-00000000003f] 18911 1727096307.55412: sending task result for task 0afff68d-5257-09a7-aae1-00000000003f skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18911 1727096307.55566: no more pending results, returning what we have 18911 1727096307.55570: results queue empty 18911 1727096307.55572: checking for any_errors_fatal 18911 1727096307.55580: done checking for any_errors_fatal 
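The trace above shows the task being skipped because `network_state != {}` evaluated to False (the role's default for `network_state` is an empty dict); the `ansible_distribution_major_version != '6'` check evaluated first appears to be inherited from an enclosing condition in the test play rather than from the task itself. A sketch of what the task at roles/network/tasks/main.yml:18 plausibly looks like; only the task name and the `network_state != {}` condition are taken from the log, while the `fail` message and the version gate are illustrative assumptions:

```yaml
- name: Abort applying the network state configuration if the system version of the managed host is below 8
  fail:
    msg: Applying the network state configuration is not supported on this host  # hypothetical wording
  when:
    - network_state != {}                            # evaluated False in the trace, so the task is skipped
    - ansible_distribution_major_version | int < 8   # assumed from the task name; never reached here
```

Because `when` lists short-circuit, the version test is never evaluated once `network_state != {}` fails, which matches the single `false_condition` reported in the skip result.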
18911 1727096307.55580: checking for max_fail_percentage 18911 1727096307.55582: done checking for max_fail_percentage 18911 1727096307.55583: checking to see if all hosts have failed and the running result is not ok 18911 1727096307.55584: done checking to see if all hosts have failed 18911 1727096307.55584: getting the remaining hosts for this loop 18911 1727096307.55586: done getting the remaining hosts for this loop 18911 1727096307.55589: getting the next task for host managed_node1 18911 1727096307.55596: done getting next task for host managed_node1 18911 1727096307.55599: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 18911 1727096307.55601: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096307.55615: getting variables 18911 1727096307.55617: in VariableManager get_vars() 18911 1727096307.55653: Calling all_inventory to load vars for managed_node1 18911 1727096307.55656: Calling groups_inventory to load vars for managed_node1 18911 1727096307.55658: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096307.55673: Calling all_plugins_play to load vars for managed_node1 18911 1727096307.55676: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096307.55679: Calling groups_plugins_play to load vars for managed_node1 18911 1727096307.56198: done sending task result for task 0afff68d-5257-09a7-aae1-00000000003f 18911 1727096307.56202: WORKER PROCESS EXITING 18911 1727096307.57411: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096307.59556: done with get_vars() 18911 1727096307.59604: done getting variables 18911 1727096307.59673: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] ***
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25
Monday 23 September 2024 08:58:27 -0400 (0:00:00.064) 0:00:26.711 ******
18911 1727096307.59714: entering _queue_task() for managed_node1/fail 18911 1727096307.60288: worker is 1 (out of 1 available) 18911 1727096307.60301: exiting _queue_task() for managed_node1/fail 18911 1727096307.60313: done queuing things up, now waiting for results queue to drain 18911 1727096307.60314: waiting for pending results... 
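The task queued here is another `fail` task, this time gated on team interfaces. Its three `when` conditions are quoted verbatim in the trace that follows (the first two evaluate True, the `selectattr` chain False, hence the skip), so the task at roles/network/tasks/main.yml:25 can be sketched roughly as follows; the conditions are taken from the log, the `fail` message is a hypothetical placeholder:

```yaml
- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  fail:
    msg: Teaming is not supported on this distribution version  # hypothetical wording
  when:
    - ansible_distribution_major_version | int > 9
    - ansible_distribution in __network_rh_distros
    - >-
      network_connections | selectattr("type", "defined") |
      selectattr("type", "match", "^team$") | list | length > 0
      or network_state.get("interfaces", []) | selectattr("type", "defined") |
      selectattr("type", "match", "^team$") | list | length > 0
```

The `selectattr("type", "defined")` step filters out connection entries with no `type` key before the regex match, so the expression is safe even when profiles omit `type`.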
18911 1727096307.60643: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 18911 1727096307.60657: in run() - task 0afff68d-5257-09a7-aae1-000000000040 18911 1727096307.60674: variable 'ansible_search_path' from source: unknown 18911 1727096307.60678: variable 'ansible_search_path' from source: unknown 18911 1727096307.60719: calling self._execute() 18911 1727096307.60825: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096307.60829: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096307.60845: variable 'omit' from source: magic vars 18911 1727096307.61240: variable 'ansible_distribution_major_version' from source: facts 18911 1727096307.61244: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096307.61549: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18911 1727096307.64472: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18911 1727096307.64530: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18911 1727096307.64569: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18911 1727096307.64600: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18911 1727096307.64629: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18911 1727096307.64712: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096307.65082: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096307.65110: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096307.65150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096307.65182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096307.65282: variable 'ansible_distribution_major_version' from source: facts 18911 1727096307.65286: Evaluated conditional (ansible_distribution_major_version | int > 9): True 18911 1727096307.65421: variable 'ansible_distribution' from source: facts 18911 1727096307.65424: variable '__network_rh_distros' from source: role '' defaults 18911 1727096307.65426: Evaluated conditional (ansible_distribution in __network_rh_distros): True 18911 1727096307.65805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096307.65808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096307.65811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 
1727096307.65816: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096307.65819: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096307.65827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096307.65853: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096307.65880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096307.65918: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096307.65936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096307.66172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096307.66176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 18911 1727096307.66178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096307.66181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096307.66184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096307.66368: variable 'network_connections' from source: play vars 18911 1727096307.66378: variable 'profile' from source: play vars 18911 1727096307.66449: variable 'profile' from source: play vars 18911 1727096307.66452: variable 'interface' from source: set_fact 18911 1727096307.66519: variable 'interface' from source: set_fact 18911 1727096307.66523: variable 'network_state' from source: role '' defaults 18911 1727096307.66593: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18911 1727096307.66752: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18911 1727096307.66799: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18911 1727096307.66824: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18911 1727096307.66851: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18911 1727096307.66894: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18911 1727096307.66921: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18911 1727096307.66945: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096307.66972: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18911 1727096307.66995: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 18911 1727096307.66999: when evaluation is False, skipping this task 18911 1727096307.67001: _execute() done 18911 1727096307.67004: dumping result to json 18911 1727096307.67006: done dumping result, returning 18911 1727096307.67020: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0afff68d-5257-09a7-aae1-000000000040] 18911 1727096307.67025: sending task result for task 0afff68d-5257-09a7-aae1-000000000040 18911 1727096307.67121: done sending task result for task 0afff68d-5257-09a7-aae1-000000000040 18911 1727096307.67125: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 18911 
1727096307.67175: no more pending results, returning what we have 18911 1727096307.67178: results queue empty 18911 1727096307.67179: checking for any_errors_fatal 18911 1727096307.67185: done checking for any_errors_fatal 18911 1727096307.67186: checking for max_fail_percentage 18911 1727096307.67188: done checking for max_fail_percentage 18911 1727096307.67189: checking to see if all hosts have failed and the running result is not ok 18911 1727096307.67190: done checking to see if all hosts have failed 18911 1727096307.67191: getting the remaining hosts for this loop 18911 1727096307.67192: done getting the remaining hosts for this loop 18911 1727096307.67196: getting the next task for host managed_node1 18911 1727096307.67203: done getting next task for host managed_node1 18911 1727096307.67208: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 18911 1727096307.67210: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096307.67223: getting variables 18911 1727096307.67226: in VariableManager get_vars() 18911 1727096307.67265: Calling all_inventory to load vars for managed_node1 18911 1727096307.67270: Calling groups_inventory to load vars for managed_node1 18911 1727096307.67273: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096307.67284: Calling all_plugins_play to load vars for managed_node1 18911 1727096307.67287: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096307.67290: Calling groups_plugins_play to load vars for managed_node1 18911 1727096307.69084: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096307.70603: done with get_vars() 18911 1727096307.70631: done getting variables 18911 1727096307.70692: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] ***
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36
Monday 23 September 2024 08:58:27 -0400 (0:00:00.110) 0:00:26.821 ******
18911 1727096307.70722: entering _queue_task() for managed_node1/dnf 18911 1727096307.71074: worker is 1 (out of 1 available) 18911 1727096307.71086: exiting _queue_task() for managed_node1/dnf 18911 1727096307.71097: done queuing things up, now waiting for results queue to drain 18911 1727096307.71098: waiting for pending results... 
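Here the queued action is the `dnf` module rather than `fail`. From the trace that follows, the task is gated on two conditions: a distribution check (`ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7`, True) and a check for wireless or team profiles (`__network_wireless_connections_defined or __network_team_connections_defined`, False, so the task is skipped). A sketch of the task at roles/network/tasks/main.yml:36, with only the name, module, and `when` conditions taken from the log; the module arguments, including the `network_packages` variable and the `check_mode` flag, are illustrative assumptions:

```yaml
- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  dnf:
    name: "{{ network_packages }}"   # hypothetical package-list variable
    state: latest
  check_mode: true                   # assumed: probe for available updates without installing
  when:
    - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
    - __network_wireless_connections_defined or __network_team_connections_defined
```

The two `__network_*_connections_defined` variables are role defaults; the trace shows each being re-evaluated against `network_connections` before the combined condition resolves to False.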
18911 1727096307.71420: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 18911 1727096307.71469: in run() - task 0afff68d-5257-09a7-aae1-000000000041 18911 1727096307.71509: variable 'ansible_search_path' from source: unknown 18911 1727096307.71520: variable 'ansible_search_path' from source: unknown 18911 1727096307.71524: calling self._execute() 18911 1727096307.71636: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096307.71640: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096307.71643: variable 'omit' from source: magic vars 18911 1727096307.72475: variable 'ansible_distribution_major_version' from source: facts 18911 1727096307.72479: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096307.72482: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18911 1727096307.74455: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18911 1727096307.74520: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18911 1727096307.74565: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18911 1727096307.74601: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18911 1727096307.74628: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18911 1727096307.74729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096307.74766: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096307.74790: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096307.74831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096307.74845: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096307.74974: variable 'ansible_distribution' from source: facts 18911 1727096307.74978: variable 'ansible_distribution_major_version' from source: facts 18911 1727096307.74995: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 18911 1727096307.75116: variable '__network_wireless_connections_defined' from source: role '' defaults 18911 1727096307.75248: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096307.75273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096307.75301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096307.75341: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096307.75354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096307.75401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096307.75425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096307.75450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096307.75490: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096307.75509: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096307.75549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096307.75573: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 
1727096307.75597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096307.75640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096307.75658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096307.75814: variable 'network_connections' from source: play vars 18911 1727096307.75826: variable 'profile' from source: play vars 18911 1727096307.75906: variable 'profile' from source: play vars 18911 1727096307.75909: variable 'interface' from source: set_fact 18911 1727096307.75972: variable 'interface' from source: set_fact 18911 1727096307.76040: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18911 1727096307.76210: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18911 1727096307.76246: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18911 1727096307.76280: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18911 1727096307.76311: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18911 1727096307.76349: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18911 1727096307.76370: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18911 1727096307.76399: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096307.76473: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18911 1727096307.76477: variable '__network_team_connections_defined' from source: role '' defaults 18911 1727096307.76716: variable 'network_connections' from source: play vars 18911 1727096307.76719: variable 'profile' from source: play vars 18911 1727096307.76785: variable 'profile' from source: play vars 18911 1727096307.76789: variable 'interface' from source: set_fact 18911 1727096307.76854: variable 'interface' from source: set_fact 18911 1727096307.76880: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 18911 1727096307.76883: when evaluation is False, skipping this task 18911 1727096307.76886: _execute() done 18911 1727096307.76888: dumping result to json 18911 1727096307.76890: done dumping result, returning 18911 1727096307.76959: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0afff68d-5257-09a7-aae1-000000000041] 18911 1727096307.76965: sending task result for task 0afff68d-5257-09a7-aae1-000000000041 18911 1727096307.77029: done sending task result for task 0afff68d-5257-09a7-aae1-000000000041 18911 1727096307.77033: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": 
"Conditional result was False" } 18911 1727096307.77092: no more pending results, returning what we have 18911 1727096307.77095: results queue empty 18911 1727096307.77096: checking for any_errors_fatal 18911 1727096307.77105: done checking for any_errors_fatal 18911 1727096307.77106: checking for max_fail_percentage 18911 1727096307.77107: done checking for max_fail_percentage 18911 1727096307.77108: checking to see if all hosts have failed and the running result is not ok 18911 1727096307.77109: done checking to see if all hosts have failed 18911 1727096307.77110: getting the remaining hosts for this loop 18911 1727096307.77111: done getting the remaining hosts for this loop 18911 1727096307.77115: getting the next task for host managed_node1 18911 1727096307.77122: done getting next task for host managed_node1 18911 1727096307.77127: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 18911 1727096307.77129: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096307.77143: getting variables 18911 1727096307.77145: in VariableManager get_vars() 18911 1727096307.77187: Calling all_inventory to load vars for managed_node1 18911 1727096307.77190: Calling groups_inventory to load vars for managed_node1 18911 1727096307.77192: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096307.77203: Calling all_plugins_play to load vars for managed_node1 18911 1727096307.77206: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096307.77209: Calling groups_plugins_play to load vars for managed_node1 18911 1727096307.78831: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096307.80451: done with get_vars() 18911 1727096307.80484: done getting variables
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
18911 1727096307.80568: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] ***
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48
Monday 23 September 2024 08:58:27 -0400 (0:00:00.098) 0:00:26.920 ******
18911 1727096307.80599: entering _queue_task() for managed_node1/yum 18911 1727096307.80958: worker is 1 (out of 1 available) 18911 1727096307.80977: exiting _queue_task() for managed_node1/yum 18911 1727096307.80989: done queuing things up, now waiting for results queue to drain 18911 1727096307.80990: waiting for pending results... 
18911 1727096307.81323: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 18911 1727096307.81396: in run() - task 0afff68d-5257-09a7-aae1-000000000042 18911 1727096307.81422: variable 'ansible_search_path' from source: unknown 18911 1727096307.81473: variable 'ansible_search_path' from source: unknown 18911 1727096307.81477: calling self._execute() 18911 1727096307.81574: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096307.81585: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096307.81599: variable 'omit' from source: magic vars 18911 1727096307.81972: variable 'ansible_distribution_major_version' from source: facts 18911 1727096307.81990: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096307.82151: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18911 1727096307.85152: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18911 1727096307.85157: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18911 1727096307.85189: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18911 1727096307.85231: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18911 1727096307.85260: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18911 1727096307.85375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096307.85378: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096307.85395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096307.85442: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096307.85457: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096307.85561: variable 'ansible_distribution_major_version' from source: facts 18911 1727096307.85590: Evaluated conditional (ansible_distribution_major_version | int < 8): False 18911 1727096307.85594: when evaluation is False, skipping this task 18911 1727096307.85597: _execute() done 18911 1727096307.85599: dumping result to json 18911 1727096307.85601: done dumping result, returning 18911 1727096307.85603: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0afff68d-5257-09a7-aae1-000000000042] 18911 1727096307.85606: sending task result for task 0afff68d-5257-09a7-aae1-000000000042 18911 1727096307.85816: done sending task result for task 0afff68d-5257-09a7-aae1-000000000042 18911 1727096307.85820: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 18911 1727096307.85882: no more pending results, returning 
what we have 18911 1727096307.85886: results queue empty 18911 1727096307.85887: checking for any_errors_fatal 18911 1727096307.85893: done checking for any_errors_fatal 18911 1727096307.85894: checking for max_fail_percentage 18911 1727096307.85896: done checking for max_fail_percentage 18911 1727096307.85898: checking to see if all hosts have failed and the running result is not ok 18911 1727096307.85898: done checking to see if all hosts have failed 18911 1727096307.85899: getting the remaining hosts for this loop 18911 1727096307.85900: done getting the remaining hosts for this loop 18911 1727096307.85904: getting the next task for host managed_node1 18911 1727096307.85911: done getting next task for host managed_node1 18911 1727096307.85915: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 18911 1727096307.85917: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096307.85931: getting variables 18911 1727096307.85933: in VariableManager get_vars() 18911 1727096307.85977: Calling all_inventory to load vars for managed_node1 18911 1727096307.85980: Calling groups_inventory to load vars for managed_node1 18911 1727096307.85983: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096307.85994: Calling all_plugins_play to load vars for managed_node1 18911 1727096307.85996: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096307.85999: Calling groups_plugins_play to load vars for managed_node1 18911 1727096307.87650: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096307.89260: done with get_vars() 18911 1727096307.89291: done getting variables 18911 1727096307.89350: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Monday 23 September 2024 08:58:27 -0400 (0:00:00.087) 0:00:27.008 ****** 18911 1727096307.89383: entering _queue_task() for managed_node1/fail 18911 1727096307.89728: worker is 1 (out of 1 available) 18911 1727096307.89740: exiting _queue_task() for managed_node1/fail 18911 1727096307.89753: done queuing things up, now waiting for results queue to drain 18911 1727096307.89754: waiting for pending results... 
18911 1727096307.90187: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 18911 1727096307.90192: in run() - task 0afff68d-5257-09a7-aae1-000000000043 18911 1727096307.90195: variable 'ansible_search_path' from source: unknown 18911 1727096307.90198: variable 'ansible_search_path' from source: unknown 18911 1727096307.90200: calling self._execute() 18911 1727096307.90314: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096307.90318: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096307.90373: variable 'omit' from source: magic vars 18911 1727096307.90731: variable 'ansible_distribution_major_version' from source: facts 18911 1727096307.90742: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096307.90873: variable '__network_wireless_connections_defined' from source: role '' defaults 18911 1727096307.91074: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18911 1727096307.93353: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18911 1727096307.93430: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18911 1727096307.93481: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18911 1727096307.93590: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18911 1727096307.93593: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18911 1727096307.93617: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 18911 1727096307.93661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096307.93698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096307.93736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096307.93750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096307.93809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096307.93831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096307.93855: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096307.93895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096307.93918: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096307.93960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096307.93985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096307.94012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096307.94049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096307.94065: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096307.94620: variable 'network_connections' from source: play vars 18911 1727096307.94623: variable 'profile' from source: play vars 18911 1727096307.94625: variable 'profile' from source: play vars 18911 1727096307.94627: variable 'interface' from source: set_fact 18911 1727096307.94774: variable 'interface' from source: set_fact 18911 1727096307.94777: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18911 1727096307.95174: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18911 1727096307.95179: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18911 1727096307.95181: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18911 1727096307.95184: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18911 1727096307.95186: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18911 1727096307.95189: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18911 1727096307.95191: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096307.95194: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18911 1727096307.95196: variable '__network_team_connections_defined' from source: role '' defaults 18911 1727096307.95389: variable 'network_connections' from source: play vars 18911 1727096307.95421: variable 'profile' from source: play vars 18911 1727096307.95454: variable 'profile' from source: play vars 18911 1727096307.95457: variable 'interface' from source: set_fact 18911 1727096307.95521: variable 'interface' from source: set_fact 18911 1727096307.95582: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 18911 1727096307.95585: when evaluation is False, skipping this task 18911 1727096307.95588: _execute() done 18911 1727096307.95590: dumping result to json 18911 1727096307.95592: done dumping result, returning 18911 1727096307.95594: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's 
consent to restart NetworkManager due to wireless or team interfaces [0afff68d-5257-09a7-aae1-000000000043] 18911 1727096307.95605: sending task result for task 0afff68d-5257-09a7-aae1-000000000043 18911 1727096307.95869: done sending task result for task 0afff68d-5257-09a7-aae1-000000000043 skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 18911 1727096307.95919: no more pending results, returning what we have 18911 1727096307.95922: results queue empty 18911 1727096307.95923: checking for any_errors_fatal 18911 1727096307.95928: done checking for any_errors_fatal 18911 1727096307.95929: checking for max_fail_percentage 18911 1727096307.95931: done checking for max_fail_percentage 18911 1727096307.95932: checking to see if all hosts have failed and the running result is not ok 18911 1727096307.95933: done checking to see if all hosts have failed 18911 1727096307.95933: getting the remaining hosts for this loop 18911 1727096307.95935: done getting the remaining hosts for this loop 18911 1727096307.95938: getting the next task for host managed_node1 18911 1727096307.95943: done getting next task for host managed_node1 18911 1727096307.95948: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 18911 1727096307.95950: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096307.95963: getting variables 18911 1727096307.95965: in VariableManager get_vars() 18911 1727096307.96003: Calling all_inventory to load vars for managed_node1 18911 1727096307.96005: Calling groups_inventory to load vars for managed_node1 18911 1727096307.96007: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096307.96018: Calling all_plugins_play to load vars for managed_node1 18911 1727096307.96021: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096307.96025: Calling groups_plugins_play to load vars for managed_node1 18911 1727096307.96543: WORKER PROCESS EXITING 18911 1727096307.97784: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096307.99798: done with get_vars() 18911 1727096307.99824: done getting variables 18911 1727096307.99903: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Monday 23 September 2024 08:58:27 -0400 (0:00:00.105) 0:00:27.113 ****** 18911 1727096307.99935: entering _queue_task() for managed_node1/package 18911 1727096308.00328: worker is 1 (out of 1 available) 18911 1727096308.00457: exiting _queue_task() for managed_node1/package 18911 1727096308.00474: done queuing things up, now waiting for results queue to drain 18911 1727096308.00476: waiting for pending results... 
18911 1727096308.00677: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 18911 1727096308.00824: in run() - task 0afff68d-5257-09a7-aae1-000000000044 18911 1727096308.00846: variable 'ansible_search_path' from source: unknown 18911 1727096308.00855: variable 'ansible_search_path' from source: unknown 18911 1727096308.00914: calling self._execute() 18911 1727096308.01039: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096308.01074: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096308.01078: variable 'omit' from source: magic vars 18911 1727096308.01520: variable 'ansible_distribution_major_version' from source: facts 18911 1727096308.01568: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096308.01794: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18911 1727096308.02115: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18911 1727096308.02155: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18911 1727096308.02221: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18911 1727096308.02283: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18911 1727096308.02412: variable 'network_packages' from source: role '' defaults 18911 1727096308.02675: variable '__network_provider_setup' from source: role '' defaults 18911 1727096308.02678: variable '__network_service_name_default_nm' from source: role '' defaults 18911 1727096308.02680: variable '__network_service_name_default_nm' from source: role '' defaults 18911 1727096308.02682: variable '__network_packages_default_nm' from source: role '' defaults 18911 1727096308.02724: variable 
'__network_packages_default_nm' from source: role '' defaults 18911 1727096308.02929: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18911 1727096308.05226: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18911 1727096308.05336: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18911 1727096308.05340: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18911 1727096308.05444: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18911 1727096308.05447: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18911 1727096308.05476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096308.05490: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096308.05515: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096308.05554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096308.05571: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 
1727096308.05615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096308.05638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096308.05660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096308.05700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096308.05713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096308.05921: variable '__network_packages_default_gobject_packages' from source: role '' defaults 18911 1727096308.06023: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096308.06047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096308.06101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096308.06111: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096308.06126: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096308.06216: variable 'ansible_python' from source: facts 18911 1727096308.06246: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 18911 1727096308.06329: variable '__network_wpa_supplicant_required' from source: role '' defaults 18911 1727096308.06411: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 18911 1727096308.06552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096308.06577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096308.06602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096308.06673: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096308.06676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096308.06700: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096308.06723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096308.06770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096308.06787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096308.06836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096308.06939: variable 'network_connections' from source: play vars 18911 1727096308.06946: variable 'profile' from source: play vars 18911 1727096308.07048: variable 'profile' from source: play vars 18911 1727096308.07055: variable 'interface' from source: set_fact 18911 1727096308.07272: variable 'interface' from source: set_fact 18911 1727096308.07276: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18911 1727096308.07278: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18911 1727096308.07281: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096308.07283: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18911 1727096308.07326: variable '__network_wireless_connections_defined' from source: role '' defaults 18911 1727096308.07604: variable 'network_connections' from source: play vars 18911 1727096308.07607: variable 'profile' from source: play vars 18911 1727096308.07705: variable 'profile' from source: play vars 18911 1727096308.07712: variable 'interface' from source: set_fact 18911 1727096308.07780: variable 'interface' from source: set_fact 18911 1727096308.07814: variable '__network_packages_default_wireless' from source: role '' defaults 18911 1727096308.07897: variable '__network_wireless_connections_defined' from source: role '' defaults 18911 1727096308.08194: variable 'network_connections' from source: play vars 18911 1727096308.08198: variable 'profile' from source: play vars 18911 1727096308.08277: variable 'profile' from source: play vars 18911 1727096308.08281: variable 'interface' from source: set_fact 18911 1727096308.08386: variable 'interface' from source: set_fact 18911 1727096308.08389: variable '__network_packages_default_team' from source: role '' defaults 18911 1727096308.08459: variable '__network_team_connections_defined' from source: role '' defaults 18911 1727096308.08760: variable 'network_connections' from source: play vars 18911 1727096308.08766: variable 'profile' from source: play vars 18911 1727096308.08879: variable 'profile' from source: play vars 18911 1727096308.08882: variable 'interface' from source: set_fact 18911 1727096308.08931: variable 'interface' from source: set_fact 18911 1727096308.08979: variable '__network_service_name_default_initscripts' from source: role '' defaults 18911 1727096308.09040: 
variable '__network_service_name_default_initscripts' from source: role '' defaults 18911 1727096308.09044: variable '__network_packages_default_initscripts' from source: role '' defaults 18911 1727096308.09097: variable '__network_packages_default_initscripts' from source: role '' defaults 18911 1727096308.09291: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 18911 1727096308.09800: variable 'network_connections' from source: play vars 18911 1727096308.09803: variable 'profile' from source: play vars 18911 1727096308.09805: variable 'profile' from source: play vars 18911 1727096308.09808: variable 'interface' from source: set_fact 18911 1727096308.09840: variable 'interface' from source: set_fact 18911 1727096308.09849: variable 'ansible_distribution' from source: facts 18911 1727096308.09852: variable '__network_rh_distros' from source: role '' defaults 18911 1727096308.09859: variable 'ansible_distribution_major_version' from source: facts 18911 1727096308.09876: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 18911 1727096308.10532: variable 'ansible_distribution' from source: facts 18911 1727096308.10536: variable '__network_rh_distros' from source: role '' defaults 18911 1727096308.10538: variable 'ansible_distribution_major_version' from source: facts 18911 1727096308.10540: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 18911 1727096308.10542: variable 'ansible_distribution' from source: facts 18911 1727096308.10544: variable '__network_rh_distros' from source: role '' defaults 18911 1727096308.10546: variable 'ansible_distribution_major_version' from source: facts 18911 1727096308.10548: variable 'network_provider' from source: set_fact 18911 1727096308.10550: variable 'ansible_facts' from source: unknown 18911 1727096308.11265: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 18911 
1727096308.11270: when evaluation is False, skipping this task 18911 1727096308.11273: _execute() done 18911 1727096308.11275: dumping result to json 18911 1727096308.11277: done dumping result, returning 18911 1727096308.11408: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0afff68d-5257-09a7-aae1-000000000044] 18911 1727096308.11412: sending task result for task 0afff68d-5257-09a7-aae1-000000000044 18911 1727096308.11482: done sending task result for task 0afff68d-5257-09a7-aae1-000000000044 18911 1727096308.11484: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 18911 1727096308.11533: no more pending results, returning what we have 18911 1727096308.11537: results queue empty 18911 1727096308.11538: checking for any_errors_fatal 18911 1727096308.11544: done checking for any_errors_fatal 18911 1727096308.11545: checking for max_fail_percentage 18911 1727096308.11547: done checking for max_fail_percentage 18911 1727096308.11548: checking to see if all hosts have failed and the running result is not ok 18911 1727096308.11548: done checking to see if all hosts have failed 18911 1727096308.11549: getting the remaining hosts for this loop 18911 1727096308.11550: done getting the remaining hosts for this loop 18911 1727096308.11553: getting the next task for host managed_node1 18911 1727096308.11560: done getting next task for host managed_node1 18911 1727096308.11566: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 18911 1727096308.11569: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 18911 1727096308.11582: getting variables 18911 1727096308.11584: in VariableManager get_vars() 18911 1727096308.11620: Calling all_inventory to load vars for managed_node1 18911 1727096308.11623: Calling groups_inventory to load vars for managed_node1 18911 1727096308.11625: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096308.11639: Calling all_plugins_play to load vars for managed_node1 18911 1727096308.11642: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096308.11644: Calling groups_plugins_play to load vars for managed_node1 18911 1727096308.13231: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096308.16098: done with get_vars() 18911 1727096308.16135: done getting variables 18911 1727096308.16302: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Monday 23 September 2024 08:58:28 -0400 (0:00:00.163) 0:00:27.277 ****** 18911 1727096308.16332: entering _queue_task() for managed_node1/package 18911 1727096308.17096: worker is 1 (out of 1 available) 18911 1727096308.17109: exiting _queue_task() for managed_node1/package 18911 1727096308.17121: done queuing things up, now waiting for results queue to drain 18911 1727096308.17122: waiting for pending results... 
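The "Install packages" skip logged above turns on Jinja2's `subset` test: the task only runs when some entry of `network_packages` is missing from the gathered package facts. A minimal Python sketch of the equivalent check follows; the package names and fact structure are illustrative, not taken from this log.

```python
# Equivalent of the logged condition:
#   not network_packages is subset(ansible_facts.packages.keys())
# Package names/versions below are illustrative assumptions.
network_packages = ["NetworkManager"]
packages_facts = {
    "NetworkManager": [{"version": "1.48.0"}],
    "openssh": [{"version": "9.6"}],
}

needs_install = not set(network_packages).issubset(packages_facts.keys())
print(needs_install)  # False: everything required is already installed, so the task skips
```

With every required package already present in `ansible_facts.packages`, the condition is False and the log records `skip_reason: "Conditional result was False"`.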
18911 1727096308.17786: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 18911 1727096308.17975: in run() - task 0afff68d-5257-09a7-aae1-000000000045 18911 1727096308.17979: variable 'ansible_search_path' from source: unknown 18911 1727096308.17982: variable 'ansible_search_path' from source: unknown 18911 1727096308.17985: calling self._execute() 18911 1727096308.18244: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096308.18248: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096308.18261: variable 'omit' from source: magic vars 18911 1727096308.19078: variable 'ansible_distribution_major_version' from source: facts 18911 1727096308.19128: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096308.19818: variable 'network_state' from source: role '' defaults 18911 1727096308.19830: Evaluated conditional (network_state != {}): False 18911 1727096308.19833: when evaluation is False, skipping this task 18911 1727096308.19836: _execute() done 18911 1727096308.19841: dumping result to json 18911 1727096308.19843: done dumping result, returning 18911 1727096308.19852: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0afff68d-5257-09a7-aae1-000000000045] 18911 1727096308.19855: sending task result for task 0afff68d-5257-09a7-aae1-000000000045 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18911 1727096308.20052: no more pending results, returning what we have 18911 1727096308.20056: results queue empty 18911 1727096308.20057: checking for any_errors_fatal 18911 1727096308.20066: done checking for any_errors_fatal 18911 1727096308.20069: checking for max_fail_percentage 18911 
1727096308.20071: done checking for max_fail_percentage 18911 1727096308.20072: checking to see if all hosts have failed and the running result is not ok 18911 1727096308.20073: done checking to see if all hosts have failed 18911 1727096308.20074: getting the remaining hosts for this loop 18911 1727096308.20075: done getting the remaining hosts for this loop 18911 1727096308.20079: getting the next task for host managed_node1 18911 1727096308.20087: done getting next task for host managed_node1 18911 1727096308.20091: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 18911 1727096308.20094: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18911 1727096308.20110: getting variables 18911 1727096308.20112: in VariableManager get_vars() 18911 1727096308.20153: Calling all_inventory to load vars for managed_node1 18911 1727096308.20156: Calling groups_inventory to load vars for managed_node1 18911 1727096308.20158: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096308.20384: Calling all_plugins_play to load vars for managed_node1 18911 1727096308.20388: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096308.20392: Calling groups_plugins_play to load vars for managed_node1 18911 1727096308.21085: done sending task result for task 0afff68d-5257-09a7-aae1-000000000045 18911 1727096308.21088: WORKER PROCESS EXITING 18911 1727096308.23876: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096308.27066: done with get_vars() 18911 1727096308.27101: done getting variables 18911 1727096308.27158: Loading ActionModule 'package' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Monday 23 September 2024 08:58:28 -0400 (0:00:00.110) 0:00:27.388 ****** 18911 1727096308.27399: entering _queue_task() for managed_node1/package 18911 1727096308.27959: worker is 1 (out of 1 available) 18911 1727096308.28176: exiting _queue_task() for managed_node1/package 18911 1727096308.28187: done queuing things up, now waiting for results queue to drain 18911 1727096308.28188: waiting for pending results... 18911 1727096308.28453: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 18911 1727096308.28815: in run() - task 0afff68d-5257-09a7-aae1-000000000046 18911 1727096308.28826: variable 'ansible_search_path' from source: unknown 18911 1727096308.28830: variable 'ansible_search_path' from source: unknown 18911 1727096308.28876: calling self._execute() 18911 1727096308.29095: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096308.29099: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096308.29114: variable 'omit' from source: magic vars 18911 1727096308.30019: variable 'ansible_distribution_major_version' from source: facts 18911 1727096308.30029: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096308.30583: variable 'network_state' from source: role '' defaults 18911 1727096308.30596: Evaluated conditional (network_state != {}): False 18911 1727096308.30599: when evaluation is False, 
skipping this task 18911 1727096308.30602: _execute() done 18911 1727096308.30605: dumping result to json 18911 1727096308.30607: done dumping result, returning 18911 1727096308.30617: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0afff68d-5257-09a7-aae1-000000000046] 18911 1727096308.30622: sending task result for task 0afff68d-5257-09a7-aae1-000000000046 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18911 1727096308.30816: no more pending results, returning what we have 18911 1727096308.30819: results queue empty 18911 1727096308.30820: checking for any_errors_fatal 18911 1727096308.30826: done checking for any_errors_fatal 18911 1727096308.30827: checking for max_fail_percentage 18911 1727096308.30829: done checking for max_fail_percentage 18911 1727096308.30830: checking to see if all hosts have failed and the running result is not ok 18911 1727096308.30831: done checking to see if all hosts have failed 18911 1727096308.30832: getting the remaining hosts for this loop 18911 1727096308.30833: done getting the remaining hosts for this loop 18911 1727096308.30837: getting the next task for host managed_node1 18911 1727096308.30844: done getting next task for host managed_node1 18911 1727096308.30847: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 18911 1727096308.30850: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096308.30865: getting variables 18911 1727096308.30868: in VariableManager get_vars() 18911 1727096308.30910: Calling all_inventory to load vars for managed_node1 18911 1727096308.30913: Calling groups_inventory to load vars for managed_node1 18911 1727096308.30916: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096308.30928: Calling all_plugins_play to load vars for managed_node1 18911 1727096308.30931: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096308.30934: Calling groups_plugins_play to load vars for managed_node1 18911 1727096308.31501: done sending task result for task 0afff68d-5257-09a7-aae1-000000000046 18911 1727096308.31504: WORKER PROCESS EXITING 18911 1727096308.34307: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096308.37503: done with get_vars() 18911 1727096308.37534: done getting variables 18911 1727096308.37592: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Monday 23 September 2024 08:58:28 -0400 (0:00:00.102) 0:00:27.490 ****** 18911 1727096308.37625: entering _queue_task() for managed_node1/service 18911 1727096308.38381: worker is 1 (out of 1 available) 18911 1727096308.38394: exiting _queue_task() for managed_node1/service 18911 1727096308.38407: done queuing things up, now waiting for results queue to drain 18911 1727096308.38408: waiting for pending results... 
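The two install tasks above ("Install NetworkManager and nmstate..." and "Install python3-libnmstate...") are both gated on `network_state != {}`. The log shows `network_state` coming from the role defaults, where it is an empty dict, so both tasks skip. A sketch of that gate, assuming the empty-dict default:

```python
# Sketch of the role-default gate that skipped both nmstate install tasks:
#   network_state != {}
# The empty-dict default is the assumption here; a caller who sets
# network_state in play vars would flip this to True.
network_state = {}

run_state_tasks = network_state != {}
print(run_state_tasks)  # False -> "Conditional result was False", task skipped
```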
18911 1727096308.38862: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 18911 1727096308.39239: in run() - task 0afff68d-5257-09a7-aae1-000000000047 18911 1727096308.39252: variable 'ansible_search_path' from source: unknown 18911 1727096308.39255: variable 'ansible_search_path' from source: unknown 18911 1727096308.39297: calling self._execute() 18911 1727096308.40101: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096308.40107: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096308.40119: variable 'omit' from source: magic vars 18911 1727096308.40895: variable 'ansible_distribution_major_version' from source: facts 18911 1727096308.40906: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096308.41032: variable '__network_wireless_connections_defined' from source: role '' defaults 18911 1727096308.41331: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18911 1727096308.45375: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18911 1727096308.45545: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18911 1727096308.45773: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18911 1727096308.45779: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18911 1727096308.45782: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18911 1727096308.45942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 18911 1727096308.46043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096308.46118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096308.46122: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096308.46125: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096308.46315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096308.46338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096308.46427: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096308.46464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096308.46623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096308.46773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096308.46988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096308.47010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096308.47047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096308.47061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096308.47229: variable 'network_connections' from source: play vars 18911 1727096308.47241: variable 'profile' from source: play vars 18911 1727096308.47517: variable 'profile' from source: play vars 18911 1727096308.47525: variable 'interface' from source: set_fact 18911 1727096308.47586: variable 'interface' from source: set_fact 18911 1727096308.47658: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18911 1727096308.54678: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18911 1727096308.54683: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18911 1727096308.54685: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18911 1727096308.54688: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18911 1727096308.54690: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18911 1727096308.54692: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18911 1727096308.54694: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096308.54696: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18911 1727096308.54698: variable '__network_team_connections_defined' from source: role '' defaults 18911 1727096308.55000: variable 'network_connections' from source: play vars 18911 1727096308.55004: variable 'profile' from source: play vars 18911 1727096308.55007: variable 'profile' from source: play vars 18911 1727096308.55012: variable 'interface' from source: set_fact 18911 1727096308.55180: variable 'interface' from source: set_fact 18911 1727096308.55202: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 18911 1727096308.55205: when evaluation is False, skipping this task 18911 1727096308.55208: _execute() done 18911 1727096308.55212: dumping result to json 18911 1727096308.55214: done dumping result, returning 18911 1727096308.55219: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart 
NetworkManager due to wireless or team interfaces [0afff68d-5257-09a7-aae1-000000000047] 18911 1727096308.55228: sending task result for task 0afff68d-5257-09a7-aae1-000000000047 18911 1727096308.55308: done sending task result for task 0afff68d-5257-09a7-aae1-000000000047 18911 1727096308.55311: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 18911 1727096308.55390: no more pending results, returning what we have 18911 1727096308.55393: results queue empty 18911 1727096308.55394: checking for any_errors_fatal 18911 1727096308.55400: done checking for any_errors_fatal 18911 1727096308.55400: checking for max_fail_percentage 18911 1727096308.55402: done checking for max_fail_percentage 18911 1727096308.55403: checking to see if all hosts have failed and the running result is not ok 18911 1727096308.55404: done checking to see if all hosts have failed 18911 1727096308.55405: getting the remaining hosts for this loop 18911 1727096308.55406: done getting the remaining hosts for this loop 18911 1727096308.55410: getting the next task for host managed_node1 18911 1727096308.55416: done getting next task for host managed_node1 18911 1727096308.55420: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 18911 1727096308.55422: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096308.55437: getting variables 18911 1727096308.55440: in VariableManager get_vars() 18911 1727096308.55477: Calling all_inventory to load vars for managed_node1 18911 1727096308.55479: Calling groups_inventory to load vars for managed_node1 18911 1727096308.55481: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096308.55490: Calling all_plugins_play to load vars for managed_node1 18911 1727096308.55492: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096308.55494: Calling groups_plugins_play to load vars for managed_node1 18911 1727096308.67034: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096308.69566: done with get_vars() 18911 1727096308.69605: done getting variables 18911 1727096308.69657: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Monday 23 September 2024 08:58:28 -0400 (0:00:00.320) 0:00:27.811 ****** 18911 1727096308.69685: entering _queue_task() for managed_node1/service 18911 1727096308.70044: worker is 1 (out of 1 available) 18911 1727096308.70192: exiting _queue_task() for managed_node1/service 18911 1727096308.70202: done queuing things up, now waiting for results queue to drain 18911 1727096308.70204: waiting for pending results... 
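The "Restart NetworkManager due to wireless or team interfaces" skip above evaluates `__network_wireless_connections_defined or __network_team_connections_defined`, which the role derives from the connection profiles in `network_connections`. A hedged Python sketch of that derivation; the profile dict below is an illustrative ethernet example, not the actual play data.

```python
# Sketch of how the wireless/team restart condition can be derived from
# network_connections. The profile below is an illustrative assumption.
network_connections = [
    {"name": "ethtest0", "type": "ethernet", "interface_name": "ethtest0"},
]

wireless_defined = any(c.get("type") == "wireless" for c in network_connections)
team_defined = any(c.get("type") == "team" for c in network_connections)

restart_nm = wireless_defined or team_defined
print(restart_nm)  # False: a plain ethernet profile needs no NetworkManager restart
```

Because neither condition holds for an ethernet-only profile, the task skips, and the play proceeds to "Enable and start NetworkManager".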
18911 1727096308.70697: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 18911 1727096308.70703: in run() - task 0afff68d-5257-09a7-aae1-000000000048 18911 1727096308.70719: variable 'ansible_search_path' from source: unknown 18911 1727096308.70726: variable 'ansible_search_path' from source: unknown 18911 1727096308.70770: calling self._execute() 18911 1727096308.70879: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096308.70895: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096308.70913: variable 'omit' from source: magic vars 18911 1727096308.71413: variable 'ansible_distribution_major_version' from source: facts 18911 1727096308.71432: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096308.71648: variable 'network_provider' from source: set_fact 18911 1727096308.71735: variable 'network_state' from source: role '' defaults 18911 1727096308.71739: Evaluated conditional (network_provider == "nm" or network_state != {}): True 18911 1727096308.71742: variable 'omit' from source: magic vars 18911 1727096308.71745: variable 'omit' from source: magic vars 18911 1727096308.71777: variable 'network_service_name' from source: role '' defaults 18911 1727096308.71863: variable 'network_service_name' from source: role '' defaults 18911 1727096308.71988: variable '__network_provider_setup' from source: role '' defaults 18911 1727096308.72000: variable '__network_service_name_default_nm' from source: role '' defaults 18911 1727096308.72068: variable '__network_service_name_default_nm' from source: role '' defaults 18911 1727096308.72083: variable '__network_packages_default_nm' from source: role '' defaults 18911 1727096308.72162: variable '__network_packages_default_nm' from source: role '' defaults 18911 1727096308.72418: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 18911 1727096308.74787: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18911 1727096308.74942: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18911 1727096308.74945: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18911 1727096308.74957: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18911 1727096308.74996: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18911 1727096308.75089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096308.75129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096308.75161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096308.75219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096308.75240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096308.75293: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 18911 1727096308.75372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096308.75375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096308.75394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096308.75417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096308.75664: variable '__network_packages_default_gobject_packages' from source: role '' defaults 18911 1727096308.75783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096308.75810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096308.75837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096308.75888: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096308.75973: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096308.76005: variable 'ansible_python' from source: facts 18911 1727096308.76029: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 18911 1727096308.76116: variable '__network_wpa_supplicant_required' from source: role '' defaults 18911 1727096308.76202: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 18911 1727096308.76335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096308.76363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096308.76392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096308.76439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096308.76454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096308.76518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096308.76572: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096308.76576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096308.76627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096308.76772: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096308.76781: variable 'network_connections' from source: play vars 18911 1727096308.76794: variable 'profile' from source: play vars 18911 1727096308.76869: variable 'profile' from source: play vars 18911 1727096308.76886: variable 'interface' from source: set_fact 18911 1727096308.76951: variable 'interface' from source: set_fact 18911 1727096308.77108: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18911 1727096308.77276: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18911 1727096308.77333: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18911 1727096308.77380: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18911 1727096308.77422: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18911 1727096308.77495: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18911 1727096308.77543: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18911 1727096308.77571: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096308.77651: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18911 1727096308.77661: variable '__network_wireless_connections_defined' from source: role '' defaults 18911 1727096308.77954: variable 'network_connections' from source: play vars 18911 1727096308.77965: variable 'profile' from source: play vars 18911 1727096308.78047: variable 'profile' from source: play vars 18911 1727096308.78058: variable 'interface' from source: set_fact 18911 1727096308.78130: variable 'interface' from source: set_fact 18911 1727096308.78164: variable '__network_packages_default_wireless' from source: role '' defaults 18911 1727096308.78255: variable '__network_wireless_connections_defined' from source: role '' defaults 18911 1727096308.78625: variable 'network_connections' from source: play vars 18911 1727096308.78630: variable 'profile' from source: play vars 18911 1727096308.78657: variable 'profile' from source: play vars 18911 1727096308.78669: variable 'interface' from source: set_fact 18911 1727096308.78751: variable 'interface' from source: set_fact 18911 1727096308.78785: variable '__network_packages_default_team' from source: role '' defaults 18911 1727096308.79174: variable '__network_team_connections_defined' from source: role '' defaults 18911 1727096308.79520: variable 
'network_connections' from source: play vars 18911 1727096308.79580: variable 'profile' from source: play vars 18911 1727096308.79653: variable 'profile' from source: play vars 18911 1727096308.79725: variable 'interface' from source: set_fact 18911 1727096308.79825: variable 'interface' from source: set_fact 18911 1727096308.79985: variable '__network_service_name_default_initscripts' from source: role '' defaults 18911 1727096308.80263: variable '__network_service_name_default_initscripts' from source: role '' defaults 18911 1727096308.80266: variable '__network_packages_default_initscripts' from source: role '' defaults 18911 1727096308.80270: variable '__network_packages_default_initscripts' from source: role '' defaults 18911 1727096308.80741: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 18911 1727096308.81444: variable 'network_connections' from source: play vars 18911 1727096308.81458: variable 'profile' from source: play vars 18911 1727096308.81526: variable 'profile' from source: play vars 18911 1727096308.81545: variable 'interface' from source: set_fact 18911 1727096308.81674: variable 'interface' from source: set_fact 18911 1727096308.81677: variable 'ansible_distribution' from source: facts 18911 1727096308.81685: variable '__network_rh_distros' from source: role '' defaults 18911 1727096308.81688: variable 'ansible_distribution_major_version' from source: facts 18911 1727096308.81690: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 18911 1727096308.81853: variable 'ansible_distribution' from source: facts 18911 1727096308.81862: variable '__network_rh_distros' from source: role '' defaults 18911 1727096308.81874: variable 'ansible_distribution_major_version' from source: facts 18911 1727096308.81892: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 18911 1727096308.82075: variable 'ansible_distribution' from source: 
facts 18911 1727096308.82085: variable '__network_rh_distros' from source: role '' defaults 18911 1727096308.82119: variable 'ansible_distribution_major_version' from source: facts 18911 1727096308.82146: variable 'network_provider' from source: set_fact 18911 1727096308.82176: variable 'omit' from source: magic vars 18911 1727096308.82210: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18911 1727096308.82273: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18911 1727096308.82288: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18911 1727096308.82336: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096308.82344: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096308.82375: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18911 1727096308.82444: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096308.82449: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096308.82506: Set connection var ansible_shell_executable to /bin/sh 18911 1727096308.82518: Set connection var ansible_timeout to 10 18911 1727096308.82553: Set connection var ansible_shell_type to sh 18911 1727096308.82556: Set connection var ansible_module_compression to ZIP_DEFLATED 18911 1727096308.82563: Set connection var ansible_pipelining to False 18911 1727096308.82566: Set connection var ansible_connection to ssh 18911 1727096308.82598: variable 'ansible_shell_executable' from source: unknown 18911 1727096308.82607: variable 'ansible_connection' from source: unknown 18911 1727096308.82658: variable 'ansible_module_compression' from source: unknown 18911 1727096308.82671: 
variable 'ansible_shell_type' from source: unknown 18911 1727096308.82675: variable 'ansible_shell_executable' from source: unknown 18911 1727096308.82677: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096308.82683: variable 'ansible_pipelining' from source: unknown 18911 1727096308.82686: variable 'ansible_timeout' from source: unknown 18911 1727096308.82688: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096308.82800: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18911 1727096308.82816: variable 'omit' from source: magic vars 18911 1727096308.82882: starting attempt loop 18911 1727096308.82887: running the handler 18911 1727096308.82972: variable 'ansible_facts' from source: unknown 18911 1727096308.83794: _low_level_execute_command(): starting 18911 1727096308.83807: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18911 1727096308.84637: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096308.84681: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096308.84706: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096308.84865: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096308.86582: stdout chunk (state=3): >>>/root <<< 18911 1727096308.86683: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096308.86773: stderr chunk (state=3): >>><<< 18911 1727096308.86777: stdout chunk (state=3): >>><<< 18911 1727096308.86895: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 18911 1727096308.86898: _low_level_execute_command(): starting 18911 1727096308.86901: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096308.868144-20268-256791082850510 `" && echo ansible-tmp-1727096308.868144-20268-256791082850510="` echo /root/.ansible/tmp/ansible-tmp-1727096308.868144-20268-256791082850510 `" ) && sleep 0' 18911 1727096308.87720: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096308.87723: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 18911 1727096308.87778: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096308.87825: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096308.87884: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096308.89876: stdout chunk (state=3): 
>>>ansible-tmp-1727096308.868144-20268-256791082850510=/root/.ansible/tmp/ansible-tmp-1727096308.868144-20268-256791082850510 <<< 18911 1727096308.90032: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096308.90036: stdout chunk (state=3): >>><<< 18911 1727096308.90038: stderr chunk (state=3): >>><<< 18911 1727096308.90173: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096308.868144-20268-256791082850510=/root/.ansible/tmp/ansible-tmp-1727096308.868144-20268-256791082850510 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096308.90176: variable 'ansible_module_compression' from source: unknown 18911 1727096308.90179: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18911d7od04qi/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 18911 1727096308.90208: variable 
'ansible_facts' from source: unknown 18911 1727096308.90443: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096308.868144-20268-256791082850510/AnsiballZ_systemd.py 18911 1727096308.90647: Sending initial data 18911 1727096308.90650: Sent initial data (155 bytes) 18911 1727096308.91362: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096308.91831: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096308.91835: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096308.91837: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096308.92158: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096308.93786: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports 
extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18911 1727096308.93917: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18911 1727096308.93921: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18911d7od04qi/tmpkfu_avtd /root/.ansible/tmp/ansible-tmp-1727096308.868144-20268-256791082850510/AnsiballZ_systemd.py <<< 18911 1727096308.93938: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096308.868144-20268-256791082850510/AnsiballZ_systemd.py" <<< 18911 1727096308.94003: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18911d7od04qi/tmpkfu_avtd" to remote "/root/.ansible/tmp/ansible-tmp-1727096308.868144-20268-256791082850510/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096308.868144-20268-256791082850510/AnsiballZ_systemd.py" <<< 18911 1727096308.96984: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096308.96989: stderr chunk (state=3): >>><<< 18911 1727096308.96992: stdout chunk (state=3): >>><<< 18911 1727096308.97050: done transferring module to remote 18911 1727096308.97060: _low_level_execute_command(): starting 18911 1727096308.97067: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096308.868144-20268-256791082850510/ 
/root/.ansible/tmp/ansible-tmp-1727096308.868144-20268-256791082850510/AnsiballZ_systemd.py && sleep 0' 18911 1727096308.98335: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096308.98340: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 18911 1727096308.98342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096308.98344: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096308.98347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found <<< 18911 1727096308.98349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096308.98465: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096308.98499: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096308.98606: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096309.00711: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096309.00715: stdout chunk (state=3): >>><<< 18911 1727096309.00717: stderr chunk (state=3): >>><<< 18911 1727096309.00720: _low_level_execute_command() done: rc=0, 
stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096309.00723: _low_level_execute_command(): starting 18911 1727096309.00725: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096308.868144-20268-256791082850510/AnsiballZ_systemd.py && sleep 0' 18911 1727096309.02021: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096309.02082: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096309.02120: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096309.02251: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096309.02281: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096309.02470: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096309.31988: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", 
"ExecMainStartTimestamp": "Mon 2024-09-23 08:51:00 EDT", "ExecMainStartTimestampMonotonic": "14125756", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Mon 2024-09-23 08:51:00 EDT", "ExecMainHandoffTimestampMonotonic": "14143412", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10649600", "MemoryPeak": "14716928", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3311579136", "EffectiveMemoryMax": "3702857728", "EffectiveMemoryHigh": "3702857728", "CPUUsageNSec": "888734000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": 
"[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpRe<<< 18911 1727096309.32027: stdout chunk (state=3): >>>ceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": 
"infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target multi-user.target cloud-init.service NetworkManager-wait-online.service network.target", "After": "system.slice dbus-broker.service systemd-journald.socket sysinit.target network-pre.target dbus.socket cloud-init-local.service basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:55:06 EDT", "StateChangeTimestampMonotonic": "260104767", "InactiveExitTimestamp": "Mon 2024-09-23 08:51:00 EDT", 
"InactiveExitTimestampMonotonic": "14126240", "ActiveEnterTimestamp": "Mon 2024-09-23 08:51:01 EDT", "ActiveEnterTimestampMonotonic": "14391210", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:51:00 EDT", "ConditionTimestampMonotonic": "14124859", "AssertTimestamp": "Mon 2024-09-23 08:51:00 EDT", "AssertTimestampMonotonic": "14124861", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "96e31adf3b0143aea7f2b03db689d56d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 18911 1727096309.34000: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
<<< 18911 1727096309.34005: stdout chunk (state=3): >>><<< 18911 1727096309.34011: stderr chunk (state=3): >>><<< 18911 1727096309.34033: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Mon 2024-09-23 08:51:00 EDT", "ExecMainStartTimestampMonotonic": "14125756", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Mon 2024-09-23 08:51:00 EDT", "ExecMainHandoffTimestampMonotonic": "14143412", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10649600", "MemoryPeak": "14716928", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3311579136", "EffectiveMemoryMax": "3702857728", "EffectiveMemoryHigh": "3702857728", "CPUUsageNSec": "888734000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target multi-user.target cloud-init.service NetworkManager-wait-online.service network.target", "After": "system.slice dbus-broker.service systemd-journald.socket sysinit.target network-pre.target dbus.socket cloud-init-local.service basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:55:06 EDT", "StateChangeTimestampMonotonic": "260104767", "InactiveExitTimestamp": "Mon 2024-09-23 08:51:00 EDT", "InactiveExitTimestampMonotonic": "14126240", "ActiveEnterTimestamp": "Mon 2024-09-23 08:51:01 EDT", "ActiveEnterTimestampMonotonic": "14391210", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:51:00 EDT", "ConditionTimestampMonotonic": "14124859", "AssertTimestamp": "Mon 2024-09-23 08:51:00 EDT", "AssertTimestampMonotonic": "14124861", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "96e31adf3b0143aea7f2b03db689d56d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
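
The repeated OpenSSH messages in the stderr recap above (`auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23'`, `mux_client_request_session: master session id: 2`) show Ansible reusing one persistent SSH connection per host via ControlMaster multiplexing, which is why each module invocation completes without a fresh handshake. A minimal sketch of the settings involved follows; the values are illustrative assumptions, not this run's actual configuration (the control-path hash `8e98a30b23` is derived per host/user/port, and exact defaults vary by ansible-core version):

```ini
; ansible.cfg -- hedged sketch of the connection-reuse settings behind the
; "auto-mux" lines above; values shown are common defaults, not verified
; against this run (which used "No config file found; using defaults")
[ssh_connection]
; keep a master connection open and funnel sessions through it
ssh_args = -C -o ControlMaster=auto -o ControlPersist=60s
; directory holding the per-host control sockets (e.g. ~/.ansible/cp/8e98a30b23)
control_path_dir = ~/.ansible/cp
```

With a live master socket, the `rm -f -r .../ansible-tmp-... && sleep 0` cleanup seen later in the trace rides over the existing channel, which is what the `Received exit status from master 0` lines confirm.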
18911 1727096309.34202: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096308.868144-20268-256791082850510/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18911 1727096309.34222: _low_level_execute_command(): starting 18911 1727096309.34225: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096308.868144-20268-256791082850510/ > /dev/null 2>&1 && sleep 0' 18911 1727096309.34877: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096309.34881: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096309.34883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096309.34902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096309.34911: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 18911 1727096309.34919: stderr chunk (state=3): >>>debug2: match not found <<< 18911 1727096309.34928: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096309.34943: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18911 1727096309.34951: stderr 
chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.125 is address <<< 18911 1727096309.34958: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 18911 1727096309.34972: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096309.34982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096309.34994: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096309.35001: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 18911 1727096309.35016: stderr chunk (state=3): >>>debug2: match found <<< 18911 1727096309.35019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096309.35090: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096309.35100: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096309.35139: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096309.35246: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096309.37201: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096309.37205: stdout chunk (state=3): >>><<< 18911 1727096309.37208: stderr chunk (state=3): >>><<< 18911 1727096309.37224: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096309.37236: handler run complete 18911 1727096309.37373: attempt loop complete, returning result 18911 1727096309.37376: _execute() done 18911 1727096309.37379: dumping result to json 18911 1727096309.37381: done dumping result, returning 18911 1727096309.37383: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0afff68d-5257-09a7-aae1-000000000048] 18911 1727096309.37386: sending task result for task 0afff68d-5257-09a7-aae1-000000000048 ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18911 1727096309.37823: no more pending results, returning what we have 18911 1727096309.37826: results queue empty 18911 1727096309.37827: checking for any_errors_fatal 18911 1727096309.37833: done checking for any_errors_fatal 18911 1727096309.37834: checking for max_fail_percentage 18911 1727096309.37835: done checking for max_fail_percentage 18911 1727096309.37836: checking to see if all hosts have failed and the running result is not ok 18911 1727096309.37837: done checking to see if all hosts have failed 18911 1727096309.37839: getting the 
remaining hosts for this loop 18911 1727096309.37840: done getting the remaining hosts for this loop 18911 1727096309.37844: getting the next task for host managed_node1 18911 1727096309.37851: done getting next task for host managed_node1 18911 1727096309.37855: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 18911 1727096309.37857: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18911 1727096309.37981: getting variables 18911 1727096309.37984: in VariableManager get_vars() 18911 1727096309.38017: Calling all_inventory to load vars for managed_node1 18911 1727096309.38019: Calling groups_inventory to load vars for managed_node1 18911 1727096309.38021: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096309.38032: Calling all_plugins_play to load vars for managed_node1 18911 1727096309.38034: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096309.38037: Calling groups_plugins_play to load vars for managed_node1 18911 1727096309.38584: done sending task result for task 0afff68d-5257-09a7-aae1-000000000048 18911 1727096309.38587: WORKER PROCESS EXITING 18911 1727096309.39691: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096309.41724: done with get_vars() 18911 1727096309.41748: done getting variables 18911 1727096309.41818: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : 
Enable and start wpa_supplicant] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Monday 23 September 2024 08:58:29 -0400 (0:00:00.721) 0:00:28.532 ****** 18911 1727096309.41856: entering _queue_task() for managed_node1/service 18911 1727096309.42232: worker is 1 (out of 1 available) 18911 1727096309.42245: exiting _queue_task() for managed_node1/service 18911 1727096309.42262: done queuing things up, now waiting for results queue to drain 18911 1727096309.42264: waiting for pending results... 18911 1727096309.42546: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 18911 1727096309.42658: in run() - task 0afff68d-5257-09a7-aae1-000000000049 18911 1727096309.42675: variable 'ansible_search_path' from source: unknown 18911 1727096309.42679: variable 'ansible_search_path' from source: unknown 18911 1727096309.42721: calling self._execute() 18911 1727096309.42826: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096309.42832: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096309.42843: variable 'omit' from source: magic vars 18911 1727096309.43249: variable 'ansible_distribution_major_version' from source: facts 18911 1727096309.43275: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096309.43380: variable 'network_provider' from source: set_fact 18911 1727096309.43456: Evaluated conditional (network_provider == "nm"): True 18911 1727096309.43480: variable '__network_wpa_supplicant_required' from source: role '' defaults 18911 1727096309.43564: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 18911 1727096309.43728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18911 1727096309.46083: Loading FilterModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18911 1727096309.46160: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18911 1727096309.46276: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18911 1727096309.46280: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18911 1727096309.46289: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18911 1727096309.46398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096309.46428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096309.46475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096309.46503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096309.46518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096309.46566: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096309.46600: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096309.46626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096309.46665: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096309.46684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096309.46735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096309.46758: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096309.46788: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096309.46830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096309.46845: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 
18911 1727096309.47000: variable 'network_connections' from source: play vars 18911 1727096309.47014: variable 'profile' from source: play vars 18911 1727096309.47153: variable 'profile' from source: play vars 18911 1727096309.47158: variable 'interface' from source: set_fact 18911 1727096309.47162: variable 'interface' from source: set_fact 18911 1727096309.47237: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18911 1727096309.47414: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18911 1727096309.47449: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18911 1727096309.47588: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18911 1727096309.47593: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18911 1727096309.47596: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18911 1727096309.47598: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18911 1727096309.47610: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096309.47636: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18911 1727096309.47695: variable '__network_wireless_connections_defined' from source: role '' defaults 18911 1727096309.47956: variable 'network_connections' 
from source: play vars 18911 1727096309.47960: variable 'profile' from source: play vars 18911 1727096309.48030: variable 'profile' from source: play vars 18911 1727096309.48172: variable 'interface' from source: set_fact 18911 1727096309.48177: variable 'interface' from source: set_fact 18911 1727096309.48179: Evaluated conditional (__network_wpa_supplicant_required): False 18911 1727096309.48182: when evaluation is False, skipping this task 18911 1727096309.48184: _execute() done 18911 1727096309.48195: dumping result to json 18911 1727096309.48198: done dumping result, returning 18911 1727096309.48200: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0afff68d-5257-09a7-aae1-000000000049] 18911 1727096309.48202: sending task result for task 0afff68d-5257-09a7-aae1-000000000049 18911 1727096309.48261: done sending task result for task 0afff68d-5257-09a7-aae1-000000000049 18911 1727096309.48264: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 18911 1727096309.48312: no more pending results, returning what we have 18911 1727096309.48315: results queue empty 18911 1727096309.48316: checking for any_errors_fatal 18911 1727096309.48335: done checking for any_errors_fatal 18911 1727096309.48336: checking for max_fail_percentage 18911 1727096309.48338: done checking for max_fail_percentage 18911 1727096309.48338: checking to see if all hosts have failed and the running result is not ok 18911 1727096309.48339: done checking to see if all hosts have failed 18911 1727096309.48340: getting the remaining hosts for this loop 18911 1727096309.48341: done getting the remaining hosts for this loop 18911 1727096309.48344: getting the next task for host managed_node1 18911 1727096309.48352: done getting next task for host managed_node1 18911 1727096309.48355: ^ task is: TASK: 
fedora.linux_system_roles.network : Enable network service 18911 1727096309.48357: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18911 1727096309.48375: getting variables 18911 1727096309.48376: in VariableManager get_vars() 18911 1727096309.48414: Calling all_inventory to load vars for managed_node1 18911 1727096309.48417: Calling groups_inventory to load vars for managed_node1 18911 1727096309.48419: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096309.48428: Calling all_plugins_play to load vars for managed_node1 18911 1727096309.48431: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096309.48434: Calling groups_plugins_play to load vars for managed_node1 18911 1727096309.49891: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096309.51681: done with get_vars() 18911 1727096309.51707: done getting variables 18911 1727096309.51777: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Monday 23 September 2024 08:58:29 -0400 (0:00:00.099) 0:00:28.632 ****** 18911 1727096309.51807: entering _queue_task() for managed_node1/service 18911 1727096309.52151: worker is 1 (out of 1 available) 18911 1727096309.52170: exiting _queue_task() for managed_node1/service 18911 
1727096309.52182: done queuing things up, now waiting for results queue to drain 18911 1727096309.52183: waiting for pending results... 18911 1727096309.52589: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 18911 1727096309.52597: in run() - task 0afff68d-5257-09a7-aae1-00000000004a 18911 1727096309.52601: variable 'ansible_search_path' from source: unknown 18911 1727096309.52604: variable 'ansible_search_path' from source: unknown 18911 1727096309.52631: calling self._execute() 18911 1727096309.52732: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096309.52745: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096309.52760: variable 'omit' from source: magic vars 18911 1727096309.53173: variable 'ansible_distribution_major_version' from source: facts 18911 1727096309.53192: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096309.53327: variable 'network_provider' from source: set_fact 18911 1727096309.53340: Evaluated conditional (network_provider == "initscripts"): False 18911 1727096309.53440: when evaluation is False, skipping this task 18911 1727096309.53444: _execute() done 18911 1727096309.53447: dumping result to json 18911 1727096309.53449: done dumping result, returning 18911 1727096309.53452: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0afff68d-5257-09a7-aae1-00000000004a] 18911 1727096309.53454: sending task result for task 0afff68d-5257-09a7-aae1-00000000004a 18911 1727096309.53525: done sending task result for task 0afff68d-5257-09a7-aae1-00000000004a 18911 1727096309.53529: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18911 1727096309.53577: no more pending results, returning what we have 18911 
1727096309.53580: results queue empty 18911 1727096309.53580: checking for any_errors_fatal 18911 1727096309.53589: done checking for any_errors_fatal 18911 1727096309.53589: checking for max_fail_percentage 18911 1727096309.53591: done checking for max_fail_percentage 18911 1727096309.53592: checking to see if all hosts have failed and the running result is not ok 18911 1727096309.53593: done checking to see if all hosts have failed 18911 1727096309.53593: getting the remaining hosts for this loop 18911 1727096309.53595: done getting the remaining hosts for this loop 18911 1727096309.53598: getting the next task for host managed_node1 18911 1727096309.53605: done getting next task for host managed_node1 18911 1727096309.53608: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 18911 1727096309.53610: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096309.53624: getting variables 18911 1727096309.53626: in VariableManager get_vars() 18911 1727096309.53662: Calling all_inventory to load vars for managed_node1 18911 1727096309.53668: Calling groups_inventory to load vars for managed_node1 18911 1727096309.53671: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096309.53681: Calling all_plugins_play to load vars for managed_node1 18911 1727096309.53684: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096309.53687: Calling groups_plugins_play to load vars for managed_node1 18911 1727096309.55259: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096309.57037: done with get_vars() 18911 1727096309.57058: done getting variables 18911 1727096309.57124: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Monday 23 September 2024 08:58:29 -0400 (0:00:00.053) 0:00:28.685 ****** 18911 1727096309.57157: entering _queue_task() for managed_node1/copy 18911 1727096309.57494: worker is 1 (out of 1 available) 18911 1727096309.57509: exiting _queue_task() for managed_node1/copy 18911 1727096309.57521: done queuing things up, now waiting for results queue to drain 18911 1727096309.57523: waiting for pending results... 
18911 1727096309.57870: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 18911 1727096309.57890: in run() - task 0afff68d-5257-09a7-aae1-00000000004b 18911 1727096309.57916: variable 'ansible_search_path' from source: unknown 18911 1727096309.57926: variable 'ansible_search_path' from source: unknown 18911 1727096309.57970: calling self._execute() 18911 1727096309.58082: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096309.58094: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096309.58115: variable 'omit' from source: magic vars 18911 1727096309.58541: variable 'ansible_distribution_major_version' from source: facts 18911 1727096309.58564: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096309.58690: variable 'network_provider' from source: set_fact 18911 1727096309.58757: Evaluated conditional (network_provider == "initscripts"): False 18911 1727096309.58761: when evaluation is False, skipping this task 18911 1727096309.58764: _execute() done 18911 1727096309.58766: dumping result to json 18911 1727096309.58771: done dumping result, returning 18911 1727096309.58774: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0afff68d-5257-09a7-aae1-00000000004b] 18911 1727096309.58777: sending task result for task 0afff68d-5257-09a7-aae1-00000000004b skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 18911 1727096309.58909: no more pending results, returning what we have 18911 1727096309.58913: results queue empty 18911 1727096309.58914: checking for any_errors_fatal 18911 1727096309.58918: done checking for any_errors_fatal 18911 1727096309.58919: checking for max_fail_percentage 18911 
1727096309.58920: done checking for max_fail_percentage 18911 1727096309.58921: checking to see if all hosts have failed and the running result is not ok 18911 1727096309.58922: done checking to see if all hosts have failed 18911 1727096309.58922: getting the remaining hosts for this loop 18911 1727096309.58924: done getting the remaining hosts for this loop 18911 1727096309.58927: getting the next task for host managed_node1 18911 1727096309.58934: done getting next task for host managed_node1 18911 1727096309.58937: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 18911 1727096309.58939: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18911 1727096309.58951: getting variables 18911 1727096309.58953: in VariableManager get_vars() 18911 1727096309.58994: Calling all_inventory to load vars for managed_node1 18911 1727096309.58997: Calling groups_inventory to load vars for managed_node1 18911 1727096309.58999: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096309.59011: Calling all_plugins_play to load vars for managed_node1 18911 1727096309.59014: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096309.59017: Calling groups_plugins_play to load vars for managed_node1 18911 1727096309.59540: done sending task result for task 0afff68d-5257-09a7-aae1-00000000004b 18911 1727096309.59543: WORKER PROCESS EXITING 18911 1727096309.60471: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096309.62315: done with get_vars() 18911 1727096309.62339: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: 
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Monday 23 September 2024 08:58:29 -0400 (0:00:00.052) 0:00:28.738 ****** 18911 1727096309.62429: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 18911 1727096309.62809: worker is 1 (out of 1 available) 18911 1727096309.62822: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 18911 1727096309.62834: done queuing things up, now waiting for results queue to drain 18911 1727096309.62835: waiting for pending results... 18911 1727096309.63106: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 18911 1727096309.63225: in run() - task 0afff68d-5257-09a7-aae1-00000000004c 18911 1727096309.63246: variable 'ansible_search_path' from source: unknown 18911 1727096309.63255: variable 'ansible_search_path' from source: unknown 18911 1727096309.63306: calling self._execute() 18911 1727096309.63475: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096309.63479: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096309.63482: variable 'omit' from source: magic vars 18911 1727096309.63857: variable 'ansible_distribution_major_version' from source: facts 18911 1727096309.63878: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096309.63889: variable 'omit' from source: magic vars 18911 1727096309.63939: variable 'omit' from source: magic vars 18911 1727096309.64111: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18911 1727096309.66634: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18911 1727096309.66713: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18911 
1727096309.66973: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18911 1727096309.66976: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18911 1727096309.66979: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18911 1727096309.66982: variable 'network_provider' from source: set_fact 18911 1727096309.67049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096309.67085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096309.67122: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096309.67165: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096309.67187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096309.67262: variable 'omit' from source: magic vars 18911 1727096309.67390: variable 'omit' from source: magic vars 18911 1727096309.67500: variable 'network_connections' from source: play vars 18911 1727096309.67514: variable 'profile' from source: play vars 18911 1727096309.67582: variable 'profile' from source: play vars 18911 1727096309.67591: variable 'interface' from source: set_fact 
18911 1727096309.67655: variable 'interface' from source: set_fact 18911 1727096309.67802: variable 'omit' from source: magic vars 18911 1727096309.67816: variable '__lsr_ansible_managed' from source: task vars 18911 1727096309.67887: variable '__lsr_ansible_managed' from source: task vars 18911 1727096309.68171: Loaded config def from plugin (lookup/template) 18911 1727096309.68191: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 18911 1727096309.68224: File lookup term: get_ansible_managed.j2 18911 1727096309.68232: variable 'ansible_search_path' from source: unknown 18911 1727096309.68244: evaluation_path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 18911 1727096309.68261: search_path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 18911 1727096309.68295: variable 'ansible_search_path' from source: unknown 18911 
1727096309.74577: variable 'ansible_managed' from source: unknown 18911 1727096309.74647: variable 'omit' from source: magic vars 18911 1727096309.74684: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18911 1727096309.74726: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18911 1727096309.74751: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18911 1727096309.74776: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096309.74792: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096309.74834: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18911 1727096309.74843: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096309.74851: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096309.74960: Set connection var ansible_shell_executable to /bin/sh 18911 1727096309.75172: Set connection var ansible_timeout to 10 18911 1727096309.75175: Set connection var ansible_shell_type to sh 18911 1727096309.75177: Set connection var ansible_module_compression to ZIP_DEFLATED 18911 1727096309.75179: Set connection var ansible_pipelining to False 18911 1727096309.75181: Set connection var ansible_connection to ssh 18911 1727096309.75183: variable 'ansible_shell_executable' from source: unknown 18911 1727096309.75185: variable 'ansible_connection' from source: unknown 18911 1727096309.75187: variable 'ansible_module_compression' from source: unknown 18911 1727096309.75189: variable 'ansible_shell_type' from source: unknown 18911 1727096309.75192: variable 'ansible_shell_executable' from source: unknown 18911 1727096309.75194: variable 'ansible_host' from source: 
host vars for 'managed_node1' 18911 1727096309.75195: variable 'ansible_pipelining' from source: unknown 18911 1727096309.75197: variable 'ansible_timeout' from source: unknown 18911 1727096309.75199: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096309.75201: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18911 1727096309.75225: variable 'omit' from source: magic vars 18911 1727096309.75236: starting attempt loop 18911 1727096309.75244: running the handler 18911 1727096309.75262: _low_level_execute_command(): starting 18911 1727096309.75276: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18911 1727096309.76092: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096309.76113: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096309.76130: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096309.76154: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096309.76262: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096309.77980: stdout chunk (state=3): >>>/root <<< 18911 1727096309.78310: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096309.78313: stdout chunk (state=3): >>><<< 18911 1727096309.78315: stderr chunk (state=3): >>><<< 18911 1727096309.78318: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096309.78320: _low_level_execute_command(): starting 18911 1727096309.78323: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp 
`"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096309.7828655-20311-131374957318742 `" && echo ansible-tmp-1727096309.7828655-20311-131374957318742="` echo /root/.ansible/tmp/ansible-tmp-1727096309.7828655-20311-131374957318742 `" ) && sleep 0' 18911 1727096309.78955: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096309.78973: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096309.78988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096309.79003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096309.79018: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 18911 1727096309.79052: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096309.79152: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096309.79164: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096309.79185: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096309.79203: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096309.79313: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 
1727096309.81292: stdout chunk (state=3): >>>ansible-tmp-1727096309.7828655-20311-131374957318742=/root/.ansible/tmp/ansible-tmp-1727096309.7828655-20311-131374957318742 <<< 18911 1727096309.81575: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096309.81579: stdout chunk (state=3): >>><<< 18911 1727096309.81582: stderr chunk (state=3): >>><<< 18911 1727096309.81585: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096309.7828655-20311-131374957318742=/root/.ansible/tmp/ansible-tmp-1727096309.7828655-20311-131374957318742 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096309.81587: variable 'ansible_module_compression' from source: unknown 18911 1727096309.81590: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-18911d7od04qi/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 18911 1727096309.81625: variable 'ansible_facts' from source: unknown 18911 1727096309.81744: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096309.7828655-20311-131374957318742/AnsiballZ_network_connections.py 18911 1727096309.81951: Sending initial data 18911 1727096309.81954: Sent initial data (168 bytes) 18911 1727096309.82583: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096309.82617: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096309.82683: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096309.82910: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096309.82984: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096309.84642: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 18911 1727096309.84684: 
stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18911 1727096309.84756: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18911 1727096309.84834: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18911d7od04qi/tmptsdfclwy /root/.ansible/tmp/ansible-tmp-1727096309.7828655-20311-131374957318742/AnsiballZ_network_connections.py <<< 18911 1727096309.84838: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096309.7828655-20311-131374957318742/AnsiballZ_network_connections.py" <<< 18911 1727096309.84921: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18911d7od04qi/tmptsdfclwy" to remote "/root/.ansible/tmp/ansible-tmp-1727096309.7828655-20311-131374957318742/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096309.7828655-20311-131374957318742/AnsiballZ_network_connections.py" <<< 18911 1727096309.85978: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096309.86046: stderr chunk (state=3): >>><<< 18911 1727096309.86049: stdout chunk (state=3): >>><<< 18911 1727096309.86054: done transferring module to remote 
18911 1727096309.86068: _low_level_execute_command(): starting 18911 1727096309.86075: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096309.7828655-20311-131374957318742/ /root/.ansible/tmp/ansible-tmp-1727096309.7828655-20311-131374957318742/AnsiballZ_network_connections.py && sleep 0' 18911 1727096309.86762: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096309.86782: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096309.86806: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096309.86818: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096309.86837: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096309.86942: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096309.88832: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096309.88847: stderr chunk (state=3): >>><<< 18911 1727096309.88858: stdout chunk (state=3): 
>>><<< 18911 1727096309.88897: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096309.88992: _low_level_execute_command(): starting 18911 1727096309.88996: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096309.7828655-20311-131374957318742/AnsiballZ_network_connections.py && sleep 0' 18911 1727096309.89584: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096309.89630: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096309.89643: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096309.89744: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096309.89774: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096309.89879: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096310.20478: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 18911 1727096310.22674: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
<<< 18911 1727096310.22683: stdout chunk (state=3): >>><<< 18911 1727096310.22686: stderr chunk (state=3): >>><<< 18911 1727096310.22688: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
18911 1727096310.22691: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'lsr27', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096309.7828655-20311-131374957318742/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18911 1727096310.22693: _low_level_execute_command(): starting 18911 1727096310.22695: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096309.7828655-20311-131374957318742/ > /dev/null 2>&1 && sleep 0' 18911 1727096310.23315: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096310.23343: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 18911 1727096310.23349: stderr chunk (state=3): >>>debug2: match not found <<< 18911 1727096310.23381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096310.23450: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096310.23466: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096310.23505: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096310.23585: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096310.25519: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096310.25523: stdout chunk (state=3): >>><<< 18911 1727096310.25526: stderr chunk (state=3): >>><<< 18911 1727096310.25551: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096310.25672: handler run complete 18911 1727096310.25676: attempt loop complete, returning result 18911 1727096310.25678: _execute() done 18911 1727096310.25680: dumping result to json 18911 1727096310.25682: done dumping result, returning 18911 1727096310.25684: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0afff68d-5257-09a7-aae1-00000000004c] 18911 1727096310.25687: sending task result for task 0afff68d-5257-09a7-aae1-00000000004c 18911 1727096310.25760: done sending task result for task 0afff68d-5257-09a7-aae1-00000000004c 18911 1727096310.25763: WORKER PROCESS EXITING changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "lsr27", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 18911 1727096310.25864: no more pending results, returning what we have 18911 1727096310.25870: results queue empty 18911 1727096310.25871: checking for any_errors_fatal 18911 1727096310.25879: done checking for any_errors_fatal 18911 1727096310.25879: checking for max_fail_percentage 18911 1727096310.25881: done checking for max_fail_percentage 18911 1727096310.25882: checking to see if all hosts have failed and the running result is not ok 18911 1727096310.25883: done checking to see if all hosts have failed 18911 1727096310.25884: getting the remaining hosts for this loop 18911 1727096310.25885: done getting the remaining hosts for this loop 18911 1727096310.25895: getting the next task for host managed_node1 18911 1727096310.25903: done getting next task for host 
managed_node1 18911 1727096310.25906: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 18911 1727096310.25908: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18911 1727096310.25919: getting variables 18911 1727096310.25921: in VariableManager get_vars() 18911 1727096310.25960: Calling all_inventory to load vars for managed_node1 18911 1727096310.25963: Calling groups_inventory to load vars for managed_node1 18911 1727096310.25965: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096310.26182: Calling all_plugins_play to load vars for managed_node1 18911 1727096310.26194: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096310.26198: Calling groups_plugins_play to load vars for managed_node1 18911 1727096310.28014: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096310.29788: done with get_vars() 18911 1727096310.29814: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Monday 23 September 2024 08:58:30 -0400 (0:00:00.674) 0:00:29.413 ****** 18911 1727096310.29903: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 18911 1727096310.30245: worker is 1 (out of 1 available) 18911 1727096310.30259: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 18911 1727096310.30275: done queuing things up, now waiting for results queue to drain 18911 1727096310.30277: waiting for pending results... 
18911 1727096310.30689: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 18911 1727096310.30695: in run() - task 0afff68d-5257-09a7-aae1-00000000004d 18911 1727096310.30700: variable 'ansible_search_path' from source: unknown 18911 1727096310.30704: variable 'ansible_search_path' from source: unknown 18911 1727096310.30781: calling self._execute() 18911 1727096310.30831: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096310.30836: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096310.30966: variable 'omit' from source: magic vars 18911 1727096310.31256: variable 'ansible_distribution_major_version' from source: facts 18911 1727096310.31279: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096310.31401: variable 'network_state' from source: role '' defaults 18911 1727096310.31412: Evaluated conditional (network_state != {}): False 18911 1727096310.31415: when evaluation is False, skipping this task 18911 1727096310.31418: _execute() done 18911 1727096310.31421: dumping result to json 18911 1727096310.31423: done dumping result, returning 18911 1727096310.31430: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0afff68d-5257-09a7-aae1-00000000004d] 18911 1727096310.31435: sending task result for task 0afff68d-5257-09a7-aae1-00000000004d 18911 1727096310.31533: done sending task result for task 0afff68d-5257-09a7-aae1-00000000004d 18911 1727096310.31536: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18911 1727096310.31597: no more pending results, returning what we have 18911 1727096310.31602: results queue empty 18911 1727096310.31603: checking for any_errors_fatal 18911 1727096310.31614: done checking for any_errors_fatal 
18911 1727096310.31615: checking for max_fail_percentage 18911 1727096310.31617: done checking for max_fail_percentage 18911 1727096310.31618: checking to see if all hosts have failed and the running result is not ok 18911 1727096310.31619: done checking to see if all hosts have failed 18911 1727096310.31620: getting the remaining hosts for this loop 18911 1727096310.31621: done getting the remaining hosts for this loop 18911 1727096310.31625: getting the next task for host managed_node1 18911 1727096310.31633: done getting next task for host managed_node1 18911 1727096310.31637: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 18911 1727096310.31640: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096310.31657: getting variables 18911 1727096310.31659: in VariableManager get_vars() 18911 1727096310.31704: Calling all_inventory to load vars for managed_node1 18911 1727096310.31708: Calling groups_inventory to load vars for managed_node1 18911 1727096310.31710: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096310.31722: Calling all_plugins_play to load vars for managed_node1 18911 1727096310.31729: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096310.31733: Calling groups_plugins_play to load vars for managed_node1 18911 1727096310.33310: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096310.34859: done with get_vars() 18911 1727096310.34887: done getting variables 18911 1727096310.34940: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Monday 23 September 2024 08:58:30 -0400 (0:00:00.050) 0:00:29.464 ****** 18911 1727096310.34973: entering _queue_task() for managed_node1/debug 18911 1727096310.35309: worker is 1 (out of 1 available) 18911 1727096310.35321: exiting _queue_task() for managed_node1/debug 18911 1727096310.35333: done queuing things up, now waiting for results queue to drain 18911 1727096310.35335: waiting for pending results... 
18911 1727096310.35598: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 18911 1727096310.35756: in run() - task 0afff68d-5257-09a7-aae1-00000000004e 18911 1727096310.35760: variable 'ansible_search_path' from source: unknown 18911 1727096310.35762: variable 'ansible_search_path' from source: unknown 18911 1727096310.35765: calling self._execute() 18911 1727096310.35904: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096310.35912: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096310.35924: variable 'omit' from source: magic vars 18911 1727096310.36421: variable 'ansible_distribution_major_version' from source: facts 18911 1727096310.36425: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096310.36428: variable 'omit' from source: magic vars 18911 1727096310.36431: variable 'omit' from source: magic vars 18911 1727096310.36461: variable 'omit' from source: magic vars 18911 1727096310.36506: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18911 1727096310.36540: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18911 1727096310.36565: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18911 1727096310.36588: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096310.36600: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096310.36636: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18911 1727096310.36641: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096310.36643: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 18911 1727096310.36850: Set connection var ansible_shell_executable to /bin/sh 18911 1727096310.36854: Set connection var ansible_timeout to 10 18911 1727096310.36856: Set connection var ansible_shell_type to sh 18911 1727096310.36859: Set connection var ansible_module_compression to ZIP_DEFLATED 18911 1727096310.36861: Set connection var ansible_pipelining to False 18911 1727096310.36863: Set connection var ansible_connection to ssh 18911 1727096310.36864: variable 'ansible_shell_executable' from source: unknown 18911 1727096310.36866: variable 'ansible_connection' from source: unknown 18911 1727096310.36870: variable 'ansible_module_compression' from source: unknown 18911 1727096310.36872: variable 'ansible_shell_type' from source: unknown 18911 1727096310.36874: variable 'ansible_shell_executable' from source: unknown 18911 1727096310.36876: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096310.36879: variable 'ansible_pipelining' from source: unknown 18911 1727096310.36881: variable 'ansible_timeout' from source: unknown 18911 1727096310.36884: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096310.37000: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18911 1727096310.37004: variable 'omit' from source: magic vars 18911 1727096310.37007: starting attempt loop 18911 1727096310.37010: running the handler 18911 1727096310.37175: variable '__network_connections_result' from source: set_fact 18911 1727096310.37195: handler run complete 18911 1727096310.37215: attempt loop complete, returning result 18911 1727096310.37219: _execute() done 18911 1727096310.37222: dumping result to json 18911 1727096310.37226: 
done dumping result, returning 18911 1727096310.37284: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0afff68d-5257-09a7-aae1-00000000004e] 18911 1727096310.37288: sending task result for task 0afff68d-5257-09a7-aae1-00000000004e 18911 1727096310.37424: done sending task result for task 0afff68d-5257-09a7-aae1-00000000004e 18911 1727096310.37433: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "" ] } 18911 1727096310.37504: no more pending results, returning what we have 18911 1727096310.37508: results queue empty 18911 1727096310.37509: checking for any_errors_fatal 18911 1727096310.37515: done checking for any_errors_fatal 18911 1727096310.37516: checking for max_fail_percentage 18911 1727096310.37517: done checking for max_fail_percentage 18911 1727096310.37518: checking to see if all hosts have failed and the running result is not ok 18911 1727096310.37519: done checking to see if all hosts have failed 18911 1727096310.37520: getting the remaining hosts for this loop 18911 1727096310.37521: done getting the remaining hosts for this loop 18911 1727096310.37525: getting the next task for host managed_node1 18911 1727096310.37532: done getting next task for host managed_node1 18911 1727096310.37536: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 18911 1727096310.37538: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096310.37549: getting variables 18911 1727096310.37551: in VariableManager get_vars() 18911 1727096310.37595: Calling all_inventory to load vars for managed_node1 18911 1727096310.37598: Calling groups_inventory to load vars for managed_node1 18911 1727096310.37601: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096310.37612: Calling all_plugins_play to load vars for managed_node1 18911 1727096310.37615: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096310.37618: Calling groups_plugins_play to load vars for managed_node1 18911 1727096310.39232: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096310.40776: done with get_vars() 18911 1727096310.40802: done getting variables 18911 1727096310.40855: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Monday 23 September 2024 08:58:30 -0400 (0:00:00.059) 0:00:29.523 ****** 18911 1727096310.40890: entering _queue_task() for managed_node1/debug 18911 1727096310.41240: worker is 1 (out of 1 available) 18911 1727096310.41254: exiting _queue_task() for managed_node1/debug 18911 1727096310.41270: done queuing things up, now waiting for results queue to drain 18911 1727096310.41271: waiting for pending results... 
18911 1727096310.41510: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 18911 1727096310.41656: in run() - task 0afff68d-5257-09a7-aae1-00000000004f 18911 1727096310.41660: variable 'ansible_search_path' from source: unknown 18911 1727096310.41662: variable 'ansible_search_path' from source: unknown 18911 1727096310.41875: calling self._execute() 18911 1727096310.41879: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096310.41883: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096310.41886: variable 'omit' from source: magic vars 18911 1727096310.42282: variable 'ansible_distribution_major_version' from source: facts 18911 1727096310.42294: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096310.42303: variable 'omit' from source: magic vars 18911 1727096310.42341: variable 'omit' from source: magic vars 18911 1727096310.42391: variable 'omit' from source: magic vars 18911 1727096310.42431: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18911 1727096310.42475: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18911 1727096310.42528: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18911 1727096310.42532: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096310.42534: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096310.42557: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18911 1727096310.42564: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096310.42575: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 18911 1727096310.42746: Set connection var ansible_shell_executable to /bin/sh 18911 1727096310.42750: Set connection var ansible_timeout to 10 18911 1727096310.42753: Set connection var ansible_shell_type to sh 18911 1727096310.42755: Set connection var ansible_module_compression to ZIP_DEFLATED 18911 1727096310.42758: Set connection var ansible_pipelining to False 18911 1727096310.42760: Set connection var ansible_connection to ssh 18911 1727096310.42762: variable 'ansible_shell_executable' from source: unknown 18911 1727096310.42763: variable 'ansible_connection' from source: unknown 18911 1727096310.42766: variable 'ansible_module_compression' from source: unknown 18911 1727096310.42771: variable 'ansible_shell_type' from source: unknown 18911 1727096310.42773: variable 'ansible_shell_executable' from source: unknown 18911 1727096310.42775: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096310.42781: variable 'ansible_pipelining' from source: unknown 18911 1727096310.42783: variable 'ansible_timeout' from source: unknown 18911 1727096310.42789: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096310.42910: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18911 1727096310.42921: variable 'omit' from source: magic vars 18911 1727096310.42965: starting attempt loop 18911 1727096310.42970: running the handler 18911 1727096310.42984: variable '__network_connections_result' from source: set_fact 18911 1727096310.43074: variable '__network_connections_result' from source: set_fact 18911 1727096310.43182: handler run complete 18911 1727096310.43202: attempt loop complete, returning result 18911 1727096310.43206: 
_execute() done 18911 1727096310.43208: dumping result to json 18911 1727096310.43216: done dumping result, returning 18911 1727096310.43225: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0afff68d-5257-09a7-aae1-00000000004f] 18911 1727096310.43230: sending task result for task 0afff68d-5257-09a7-aae1-00000000004f 18911 1727096310.43431: done sending task result for task 0afff68d-5257-09a7-aae1-00000000004f 18911 1727096310.43437: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "lsr27", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 18911 1727096310.43530: no more pending results, returning what we have 18911 1727096310.43534: results queue empty 18911 1727096310.43535: checking for any_errors_fatal 18911 1727096310.43545: done checking for any_errors_fatal 18911 1727096310.43546: checking for max_fail_percentage 18911 1727096310.43548: done checking for max_fail_percentage 18911 1727096310.43549: checking to see if all hosts have failed and the running result is not ok 18911 1727096310.43550: done checking to see if all hosts have failed 18911 1727096310.43550: getting the remaining hosts for this loop 18911 1727096310.43552: done getting the remaining hosts for this loop 18911 1727096310.43556: getting the next task for host managed_node1 18911 1727096310.43565: done getting next task for host managed_node1 18911 1727096310.43571: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 18911 1727096310.43573: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18911 1727096310.43584: getting variables 18911 1727096310.43586: in VariableManager get_vars() 18911 1727096310.43625: Calling all_inventory to load vars for managed_node1 18911 1727096310.43629: Calling groups_inventory to load vars for managed_node1 18911 1727096310.43631: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096310.43642: Calling all_plugins_play to load vars for managed_node1 18911 1727096310.43645: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096310.43649: Calling groups_plugins_play to load vars for managed_node1 18911 1727096310.45155: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096310.46839: done with get_vars() 18911 1727096310.46859: done getting variables 18911 1727096310.46919: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Monday 23 September 2024 08:58:30 -0400 (0:00:00.060) 0:00:29.583 ****** 18911 1727096310.46955: entering _queue_task() for managed_node1/debug 18911 1727096310.47293: worker is 1 (out of 1 available) 18911 1727096310.47306: exiting _queue_task() for managed_node1/debug 18911 1727096310.47318: done queuing things up, now waiting for results queue to drain 18911 1727096310.47320: waiting for pending results... 
18911 1727096310.47745: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 18911 1727096310.47751: in run() - task 0afff68d-5257-09a7-aae1-000000000050 18911 1727096310.47754: variable 'ansible_search_path' from source: unknown 18911 1727096310.47757: variable 'ansible_search_path' from source: unknown 18911 1727096310.47759: calling self._execute() 18911 1727096310.47867: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096310.47876: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096310.47888: variable 'omit' from source: magic vars 18911 1727096310.48299: variable 'ansible_distribution_major_version' from source: facts 18911 1727096310.48309: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096310.48440: variable 'network_state' from source: role '' defaults 18911 1727096310.48453: Evaluated conditional (network_state != {}): False 18911 1727096310.48457: when evaluation is False, skipping this task 18911 1727096310.48461: _execute() done 18911 1727096310.48464: dumping result to json 18911 1727096310.48517: done dumping result, returning 18911 1727096310.48522: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0afff68d-5257-09a7-aae1-000000000050] 18911 1727096310.48525: sending task result for task 0afff68d-5257-09a7-aae1-000000000050 18911 1727096310.48591: done sending task result for task 0afff68d-5257-09a7-aae1-000000000050 18911 1727096310.48594: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "network_state != {}" } 18911 1727096310.48647: no more pending results, returning what we have 18911 1727096310.48652: results queue empty 18911 1727096310.48653: checking for any_errors_fatal 18911 1727096310.48661: done checking for any_errors_fatal 18911 1727096310.48664: checking for 
max_fail_percentage 18911 1727096310.48669: done checking for max_fail_percentage 18911 1727096310.48669: checking to see if all hosts have failed and the running result is not ok 18911 1727096310.48670: done checking to see if all hosts have failed 18911 1727096310.48671: getting the remaining hosts for this loop 18911 1727096310.48673: done getting the remaining hosts for this loop 18911 1727096310.48677: getting the next task for host managed_node1 18911 1727096310.48685: done getting next task for host managed_node1 18911 1727096310.48688: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 18911 1727096310.48691: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18911 1727096310.48707: getting variables 18911 1727096310.48709: in VariableManager get_vars() 18911 1727096310.48747: Calling all_inventory to load vars for managed_node1 18911 1727096310.48751: Calling groups_inventory to load vars for managed_node1 18911 1727096310.48753: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096310.48971: Calling all_plugins_play to load vars for managed_node1 18911 1727096310.48976: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096310.48980: Calling groups_plugins_play to load vars for managed_node1 18911 1727096310.50318: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096310.51892: done with get_vars() 18911 1727096310.51916: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Monday 23 September 2024 08:58:30 -0400 
(0:00:00.050) 0:00:29.634 ****** 18911 1727096310.52003: entering _queue_task() for managed_node1/ping 18911 1727096310.52320: worker is 1 (out of 1 available) 18911 1727096310.52333: exiting _queue_task() for managed_node1/ping 18911 1727096310.52344: done queuing things up, now waiting for results queue to drain 18911 1727096310.52346: waiting for pending results... 18911 1727096310.52688: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 18911 1727096310.52785: in run() - task 0afff68d-5257-09a7-aae1-000000000051 18911 1727096310.52790: variable 'ansible_search_path' from source: unknown 18911 1727096310.52792: variable 'ansible_search_path' from source: unknown 18911 1727096310.52795: calling self._execute() 18911 1727096310.52866: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096310.52876: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096310.52888: variable 'omit' from source: magic vars 18911 1727096310.53274: variable 'ansible_distribution_major_version' from source: facts 18911 1727096310.53284: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096310.53434: variable 'omit' from source: magic vars 18911 1727096310.53438: variable 'omit' from source: magic vars 18911 1727096310.53441: variable 'omit' from source: magic vars 18911 1727096310.53444: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18911 1727096310.53463: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18911 1727096310.53491: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18911 1727096310.53509: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096310.53520: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096310.53552: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18911 1727096310.53556: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096310.53558: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096310.53676: Set connection var ansible_shell_executable to /bin/sh 18911 1727096310.53688: Set connection var ansible_timeout to 10 18911 1727096310.53691: Set connection var ansible_shell_type to sh 18911 1727096310.53698: Set connection var ansible_module_compression to ZIP_DEFLATED 18911 1727096310.53703: Set connection var ansible_pipelining to False 18911 1727096310.53709: Set connection var ansible_connection to ssh 18911 1727096310.53732: variable 'ansible_shell_executable' from source: unknown 18911 1727096310.53735: variable 'ansible_connection' from source: unknown 18911 1727096310.53738: variable 'ansible_module_compression' from source: unknown 18911 1727096310.53740: variable 'ansible_shell_type' from source: unknown 18911 1727096310.53742: variable 'ansible_shell_executable' from source: unknown 18911 1727096310.53744: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096310.53749: variable 'ansible_pipelining' from source: unknown 18911 1727096310.53752: variable 'ansible_timeout' from source: unknown 18911 1727096310.53756: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096310.53986: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18911 1727096310.53996: variable 'omit' from source: magic vars 18911 1727096310.54002: starting attempt loop 18911 1727096310.54004: running 
the handler 18911 1727096310.54028: _low_level_execute_command(): starting 18911 1727096310.54034: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18911 1727096310.54778: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096310.54791: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096310.54802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096310.54818: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096310.54831: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 18911 1727096310.54839: stderr chunk (state=3): >>>debug2: match not found <<< 18911 1727096310.54853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096310.54871: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18911 1727096310.54960: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096310.54972: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096310.55081: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096310.56873: stdout chunk (state=3): >>>/root <<< 18911 1727096310.57113: stderr 
chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096310.57187: stderr chunk (state=3): >>><<< 18911 1727096310.57190: stdout chunk (state=3): >>><<< 18911 1727096310.57202: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096310.57219: _low_level_execute_command(): starting 18911 1727096310.57223: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096310.5720365-20349-187319892643214 `" && echo ansible-tmp-1727096310.5720365-20349-187319892643214="` echo /root/.ansible/tmp/ansible-tmp-1727096310.5720365-20349-187319892643214 `" ) && sleep 0' 18911 1727096310.57841: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096310.57850: 
stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096310.57860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096310.57879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096310.57890: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 18911 1727096310.57898: stderr chunk (state=3): >>>debug2: match not found <<< 18911 1727096310.57985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096310.58004: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096310.58143: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096310.60049: stdout chunk (state=3): >>>ansible-tmp-1727096310.5720365-20349-187319892643214=/root/.ansible/tmp/ansible-tmp-1727096310.5720365-20349-187319892643214 <<< 18911 1727096310.60151: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096310.60233: stderr chunk (state=3): >>><<< 18911 1727096310.60237: stdout chunk (state=3): >>><<< 18911 1727096310.60257: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727096310.5720365-20349-187319892643214=/root/.ansible/tmp/ansible-tmp-1727096310.5720365-20349-187319892643214 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096310.60428: variable 'ansible_module_compression' from source: unknown 18911 1727096310.60432: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18911d7od04qi/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 18911 1727096310.60434: variable 'ansible_facts' from source: unknown 18911 1727096310.60503: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096310.5720365-20349-187319892643214/AnsiballZ_ping.py 18911 1727096310.60674: Sending initial data 18911 1727096310.60684: Sent initial data (153 bytes) 18911 1727096310.61379: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096310.61425: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096310.61530: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096310.61581: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096310.61645: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096310.63278: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension 
"home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18911 1727096310.63371: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18911 1727096310.63442: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18911d7od04qi/tmpxzwsapqb /root/.ansible/tmp/ansible-tmp-1727096310.5720365-20349-187319892643214/AnsiballZ_ping.py <<< 18911 1727096310.63445: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096310.5720365-20349-187319892643214/AnsiballZ_ping.py" <<< 18911 1727096310.63500: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18911d7od04qi/tmpxzwsapqb" to remote "/root/.ansible/tmp/ansible-tmp-1727096310.5720365-20349-187319892643214/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096310.5720365-20349-187319892643214/AnsiballZ_ping.py" <<< 18911 1727096310.64337: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096310.64378: stderr chunk (state=3): >>><<< 18911 1727096310.64389: stdout chunk (state=3): >>><<< 18911 1727096310.64540: done transferring module to remote 18911 1727096310.64543: _low_level_execute_command(): starting 18911 1727096310.64546: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096310.5720365-20349-187319892643214/ /root/.ansible/tmp/ansible-tmp-1727096310.5720365-20349-187319892643214/AnsiballZ_ping.py && sleep 0' 18911 1727096310.65124: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096310.65142: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096310.65158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096310.65186: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096310.65224: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 18911 1727096310.65286: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096310.65338: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096310.65355: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096310.65382: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096310.65491: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096310.67474: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096310.67478: stdout chunk (state=3): >>><<< 18911 1727096310.67481: stderr chunk (state=3): >>><<< 18911 1727096310.67483: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096310.67485: _low_level_execute_command(): starting 18911 1727096310.67487: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096310.5720365-20349-187319892643214/AnsiballZ_ping.py && sleep 0' 18911 1727096310.68095: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096310.68103: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096310.68114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096310.68185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096310.68272: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096310.68276: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096310.68278: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096310.68369: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096310.83776: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 18911 1727096310.84991: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. <<< 18911 1727096310.84995: stdout chunk (state=3): >>><<< 18911 1727096310.84998: stderr chunk (state=3): >>><<< 18911 1727096310.85021: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 18911 1727096310.85044: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096310.5720365-20349-187319892643214/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18911 1727096310.85173: _low_level_execute_command(): starting 18911 1727096310.85177: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096310.5720365-20349-187319892643214/ > /dev/null 2>&1 && sleep 0' 18911 1727096310.86317: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096310.86332: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096310.86345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096310.86360: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096310.86382: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 18911 1727096310.86544: stderr chunk (state=3): >>>debug2: match not found <<< 18911 
1727096310.86644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096310.86888: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096310.86984: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096310.88883: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096310.88909: stderr chunk (state=3): >>><<< 18911 1727096310.88920: stdout chunk (state=3): >>><<< 18911 1727096310.88942: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096310.88955: handler run complete 18911 1727096310.88983: attempt loop complete, returning result 18911 1727096310.89017: _execute() done 18911 1727096310.89026: dumping result to json 18911 1727096310.89033: done dumping result, returning 18911 1727096310.89083: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0afff68d-5257-09a7-aae1-000000000051] 18911 1727096310.89092: sending task result for task 0afff68d-5257-09a7-aae1-000000000051 18911 1727096310.89542: done sending task result for task 0afff68d-5257-09a7-aae1-000000000051 18911 1727096310.89546: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "ping": "pong" } 18911 1727096310.89607: no more pending results, returning what we have 18911 1727096310.89610: results queue empty 18911 1727096310.89611: checking for any_errors_fatal 18911 1727096310.89618: done checking for any_errors_fatal 18911 1727096310.89619: checking for max_fail_percentage 18911 1727096310.89621: done checking for max_fail_percentage 18911 1727096310.89621: checking to see if all hosts have failed and the running result is not ok 18911 1727096310.89622: done checking to see if all hosts have failed 18911 1727096310.89623: getting the remaining hosts for this loop 18911 1727096310.89624: done getting the remaining hosts for this loop 18911 1727096310.89629: getting the next task for host managed_node1 18911 1727096310.89638: done getting next task for host managed_node1 18911 1727096310.89639: ^ task is: TASK: meta (role_complete) 18911 
1727096310.89641: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18911 1727096310.89651: getting variables 18911 1727096310.89653: in VariableManager get_vars() 18911 1727096310.89693: Calling all_inventory to load vars for managed_node1 18911 1727096310.89696: Calling groups_inventory to load vars for managed_node1 18911 1727096310.89698: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096310.89707: Calling all_plugins_play to load vars for managed_node1 18911 1727096310.89710: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096310.89712: Calling groups_plugins_play to load vars for managed_node1 18911 1727096310.92846: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096310.94539: done with get_vars() 18911 1727096310.94560: done getting variables 18911 1727096310.94647: done queuing things up, now waiting for results queue to drain 18911 1727096310.94649: results queue empty 18911 1727096310.94649: checking for any_errors_fatal 18911 1727096310.94652: done checking for any_errors_fatal 18911 1727096310.94653: checking for max_fail_percentage 18911 1727096310.94654: done checking for max_fail_percentage 18911 1727096310.94654: checking to see if all hosts have failed and the running result is not ok 18911 1727096310.94655: done checking to see if all hosts have failed 18911 1727096310.94656: getting the remaining hosts for this loop 18911 1727096310.94657: done getting the remaining hosts for this loop 18911 1727096310.94659: getting the next task for host managed_node1 18911 1727096310.94665: done getting next task for host managed_node1 18911 1727096310.94667: ^ task is: TASK: meta (flush_handlers) 
18911 1727096310.94673: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18911 1727096310.94676: getting variables 18911 1727096310.94677: in VariableManager get_vars() 18911 1727096310.94689: Calling all_inventory to load vars for managed_node1 18911 1727096310.94691: Calling groups_inventory to load vars for managed_node1 18911 1727096310.94693: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096310.94698: Calling all_plugins_play to load vars for managed_node1 18911 1727096310.94700: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096310.94703: Calling groups_plugins_play to load vars for managed_node1 18911 1727096310.95926: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096310.97781: done with get_vars() 18911 1727096310.97804: done getting variables 18911 1727096310.97862: in VariableManager get_vars() 18911 1727096310.97880: Calling all_inventory to load vars for managed_node1 18911 1727096310.97883: Calling groups_inventory to load vars for managed_node1 18911 1727096310.97885: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096310.97890: Calling all_plugins_play to load vars for managed_node1 18911 1727096310.97892: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096310.97895: Calling groups_plugins_play to load vars for managed_node1 18911 1727096310.99272: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096311.01418: done with get_vars() 18911 1727096311.01570: done queuing things up, now waiting for results queue to drain 18911 1727096311.01572: results queue empty 18911 
1727096311.01573: checking for any_errors_fatal 18911 1727096311.01575: done checking for any_errors_fatal 18911 1727096311.01575: checking for max_fail_percentage 18911 1727096311.01577: done checking for max_fail_percentage 18911 1727096311.01577: checking to see if all hosts have failed and the running result is not ok 18911 1727096311.01578: done checking to see if all hosts have failed 18911 1727096311.01579: getting the remaining hosts for this loop 18911 1727096311.01580: done getting the remaining hosts for this loop 18911 1727096311.01583: getting the next task for host managed_node1 18911 1727096311.01587: done getting next task for host managed_node1 18911 1727096311.01589: ^ task is: TASK: meta (flush_handlers) 18911 1727096311.01590: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096311.01593: getting variables 18911 1727096311.01594: in VariableManager get_vars() 18911 1727096311.01609: Calling all_inventory to load vars for managed_node1 18911 1727096311.01611: Calling groups_inventory to load vars for managed_node1 18911 1727096311.01613: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096311.01618: Calling all_plugins_play to load vars for managed_node1 18911 1727096311.01620: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096311.01622: Calling groups_plugins_play to load vars for managed_node1 18911 1727096311.03474: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096311.05245: done with get_vars() 18911 1727096311.05269: done getting variables 18911 1727096311.05336: in VariableManager get_vars() 18911 1727096311.05349: Calling all_inventory to load vars for managed_node1 18911 1727096311.05351: Calling groups_inventory to load vars for managed_node1 18911 1727096311.05353: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096311.05358: Calling all_plugins_play to load vars for managed_node1 18911 1727096311.05360: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096311.05365: Calling groups_plugins_play to load vars for managed_node1 18911 1727096311.06560: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096311.08399: done with get_vars() 18911 1727096311.08428: done queuing things up, now waiting for results queue to drain 18911 1727096311.08430: results queue empty 18911 1727096311.08431: checking for any_errors_fatal 18911 1727096311.08432: done checking for any_errors_fatal 18911 1727096311.08433: checking for max_fail_percentage 18911 1727096311.08434: done checking for max_fail_percentage 18911 1727096311.08435: checking to see if all hosts have failed and the running result is not 
ok 18911 1727096311.08436: done checking to see if all hosts have failed 18911 1727096311.08436: getting the remaining hosts for this loop 18911 1727096311.08437: done getting the remaining hosts for this loop 18911 1727096311.08440: getting the next task for host managed_node1 18911 1727096311.08443: done getting next task for host managed_node1 18911 1727096311.08444: ^ task is: None 18911 1727096311.08445: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18911 1727096311.08446: done queuing things up, now waiting for results queue to drain 18911 1727096311.08447: results queue empty 18911 1727096311.08448: checking for any_errors_fatal 18911 1727096311.08449: done checking for any_errors_fatal 18911 1727096311.08449: checking for max_fail_percentage 18911 1727096311.08450: done checking for max_fail_percentage 18911 1727096311.08451: checking to see if all hosts have failed and the running result is not ok 18911 1727096311.08451: done checking to see if all hosts have failed 18911 1727096311.08452: getting the next task for host managed_node1 18911 1727096311.08459: done getting next task for host managed_node1 18911 1727096311.08460: ^ task is: None 18911 1727096311.08462: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096311.08509: in VariableManager get_vars() 18911 1727096311.08525: done with get_vars() 18911 1727096311.08530: in VariableManager get_vars() 18911 1727096311.08539: done with get_vars() 18911 1727096311.08544: variable 'omit' from source: magic vars 18911 1727096311.08683: in VariableManager get_vars() 18911 1727096311.08719: done with get_vars() 18911 1727096311.08749: variable 'omit' from source: magic vars PLAY [Delete the interface] **************************************************** 18911 1727096311.08936: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18911 1727096311.08959: getting the remaining hosts for this loop 18911 1727096311.08961: done getting the remaining hosts for this loop 18911 1727096311.08966: getting the next task for host managed_node1 18911 1727096311.08971: done getting next task for host managed_node1 18911 1727096311.08973: ^ task is: TASK: Gathering Facts 18911 1727096311.08974: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096311.08976: getting variables 18911 1727096311.08977: in VariableManager get_vars() 18911 1727096311.08985: Calling all_inventory to load vars for managed_node1 18911 1727096311.08987: Calling groups_inventory to load vars for managed_node1 18911 1727096311.08989: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096311.08994: Calling all_plugins_play to load vars for managed_node1 18911 1727096311.08997: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096311.08999: Calling groups_plugins_play to load vars for managed_node1 18911 1727096311.10340: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096311.12647: done with get_vars() 18911 1727096311.12678: done getting variables 18911 1727096311.12724: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5 Monday 23 September 2024 08:58:31 -0400 (0:00:00.607) 0:00:30.241 ****** 18911 1727096311.12749: entering _queue_task() for managed_node1/gather_facts 18911 1727096311.13107: worker is 1 (out of 1 available) 18911 1727096311.13124: exiting _queue_task() for managed_node1/gather_facts 18911 1727096311.13136: done queuing things up, now waiting for results queue to drain 18911 1727096311.13138: waiting for pending results... 
18911 1727096311.13396: running TaskExecutor() for managed_node1/TASK: Gathering Facts 18911 1727096311.13462: in run() - task 0afff68d-5257-09a7-aae1-0000000003f8 18911 1727096311.13490: variable 'ansible_search_path' from source: unknown 18911 1727096311.13531: calling self._execute() 18911 1727096311.13629: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096311.13672: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096311.13679: variable 'omit' from source: magic vars 18911 1727096311.14085: variable 'ansible_distribution_major_version' from source: facts 18911 1727096311.14102: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096311.14117: variable 'omit' from source: magic vars 18911 1727096311.14223: variable 'omit' from source: magic vars 18911 1727096311.14226: variable 'omit' from source: magic vars 18911 1727096311.14243: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18911 1727096311.14284: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18911 1727096311.14312: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18911 1727096311.14340: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096311.14356: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096311.14394: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18911 1727096311.14403: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096311.14412: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096311.14514: Set connection var ansible_shell_executable to /bin/sh 18911 1727096311.14524: Set 
connection var ansible_timeout to 10 18911 1727096311.14529: Set connection var ansible_shell_type to sh 18911 1727096311.14538: Set connection var ansible_module_compression to ZIP_DEFLATED 18911 1727096311.14549: Set connection var ansible_pipelining to False 18911 1727096311.14658: Set connection var ansible_connection to ssh 18911 1727096311.14661: variable 'ansible_shell_executable' from source: unknown 18911 1727096311.14664: variable 'ansible_connection' from source: unknown 18911 1727096311.14666: variable 'ansible_module_compression' from source: unknown 18911 1727096311.14670: variable 'ansible_shell_type' from source: unknown 18911 1727096311.14672: variable 'ansible_shell_executable' from source: unknown 18911 1727096311.14674: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096311.14676: variable 'ansible_pipelining' from source: unknown 18911 1727096311.14678: variable 'ansible_timeout' from source: unknown 18911 1727096311.14680: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096311.14815: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18911 1727096311.14829: variable 'omit' from source: magic vars 18911 1727096311.14837: starting attempt loop 18911 1727096311.14843: running the handler 18911 1727096311.14861: variable 'ansible_facts' from source: unknown 18911 1727096311.14887: _low_level_execute_command(): starting 18911 1727096311.14898: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18911 1727096311.15651: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096311.15721: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096311.15746: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096311.15774: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096311.15932: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096311.17679: stdout chunk (state=3): >>>/root <<< 18911 1727096311.17863: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096311.17870: stdout chunk (state=3): >>><<< 18911 1727096311.17873: stderr chunk (state=3): >>><<< 18911 1727096311.18030: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096311.18079: _low_level_execute_command(): starting 18911 1727096311.18083: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096311.1803036-20378-169819597523492 `" && echo ansible-tmp-1727096311.1803036-20378-169819597523492="` echo /root/.ansible/tmp/ansible-tmp-1727096311.1803036-20378-169819597523492 `" ) && sleep 0' 18911 1727096311.19274: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096311.19284: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096311.19288: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration <<< 18911 1727096311.19299: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found <<< 18911 1727096311.19302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096311.19547: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096311.19551: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096311.19635: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096311.21606: stdout chunk (state=3): >>>ansible-tmp-1727096311.1803036-20378-169819597523492=/root/.ansible/tmp/ansible-tmp-1727096311.1803036-20378-169819597523492 <<< 18911 1727096311.21752: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096311.21762: stdout chunk (state=3): >>><<< 18911 1727096311.21777: stderr chunk (state=3): >>><<< 18911 1727096311.21798: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096311.1803036-20378-169819597523492=/root/.ansible/tmp/ansible-tmp-1727096311.1803036-20378-169819597523492 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096311.21831: variable 'ansible_module_compression' from source: unknown 18911 1727096311.21973: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18911d7od04qi/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 18911 1727096311.21976: variable 'ansible_facts' from source: unknown 18911 1727096311.22159: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096311.1803036-20378-169819597523492/AnsiballZ_setup.py 18911 1727096311.22397: Sending initial data 18911 1727096311.22407: Sent initial data (154 bytes) 18911 1727096311.22927: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096311.22944: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096311.22961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096311.23076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096311.23094: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096311.23198: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096311.24785: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 18911 1727096311.24827: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18911 1727096311.24885: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18911 1727096311.24981: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18911d7od04qi/tmphqayfggf /root/.ansible/tmp/ansible-tmp-1727096311.1803036-20378-169819597523492/AnsiballZ_setup.py <<< 18911 1727096311.25002: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096311.1803036-20378-169819597523492/AnsiballZ_setup.py" <<< 18911 1727096311.25057: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18911d7od04qi/tmphqayfggf" to remote "/root/.ansible/tmp/ansible-tmp-1727096311.1803036-20378-169819597523492/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096311.1803036-20378-169819597523492/AnsiballZ_setup.py" <<< 18911 1727096311.27050: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096311.27214: stderr chunk (state=3): >>><<< 18911 1727096311.27218: stdout chunk (state=3): >>><<< 18911 1727096311.27221: done transferring module to remote 18911 1727096311.27223: _low_level_execute_command(): starting 18911 1727096311.27225: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096311.1803036-20378-169819597523492/ /root/.ansible/tmp/ansible-tmp-1727096311.1803036-20378-169819597523492/AnsiballZ_setup.py && sleep 0' 18911 1727096311.27826: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096311.27843: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096311.27858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096311.27885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096311.27926: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096311.28026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096311.28052: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096311.28152: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096311.30018: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096311.30022: stdout chunk (state=3): >>><<< 18911 1727096311.30024: stderr chunk (state=3): >>><<< 18911 1727096311.30040: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096311.30048: _low_level_execute_command(): starting 18911 1727096311.30057: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096311.1803036-20378-169819597523492/AnsiballZ_setup.py && sleep 0' 18911 1727096311.30695: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096311.30717: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096311.30732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096311.30756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096311.30771: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 18911 1727096311.30785: stderr chunk (state=3): >>>debug2: match not found <<< 18911 1727096311.30836: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096311.30898: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096311.30916: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096311.30950: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096311.31069: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096311.98280: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_lsb": {}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCSodd+GDkKMH+cjKateYxMnFqGIiE1cnsha6Cte8GTIWdeerR6yaWx/72lnuD/E1u9q+QiEGNjwKJ+EVlEbMpe1E41fS12cQMWtucXlJvMJH1Ykit9m7FF02YjLW9EkA9SSgc18cxhBxGHI+sOyXoiqbbp/q8CeDDI6WWQ+a3ZlRONcuaxwv9JBCJY6ZIqEDUsG55fAkwTft0LrM9VfN5D7iCcrlRTAVcjACJKgdvzHt52sVh5wuhCgzisIGrxldgqYAo0PnKgLwHzMWTYpwBJQIwMf6hC7O+PTnljDRXhsY8vTDdVI9W4Nr7mLWJ7W5JZ0lrKajT+FZ1b8usigp6firlnTtqzVmKJ4HJTlBOQ00HmCvVRSFb8Y15UB8Gz/xjJSXiQklJoKt+/wRRazuZALEowaIIdTzQUGv5Yx/BYhcAv8KVvhqbfkvyuwAgI+jzck3nhlUJ3B1WVYs5RdDY1WWEbqEtKyERi7P+cyWg6OhEMmJIpLs1QBW+CU/89w+s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE9Vvl7mU+dsvtGHlOyMt0Nx2BOUkSxor6Ul/pQgHJkP0t5ie5s9TW9uniTzz5NfayEXaMlnOkK/UBl0o7msG9k=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOsOWQBYI+BXY81/x/jv3URNQxi5p8vLDeQfEZVgnjS4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": 
"6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-125", "ansible_nodename": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2082d1236a0674562ec5e13633e7ec", "ansible_fibre_channel_wwn": [], "ansible_apparmor": {"status": "disabled"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "58", "second": "31", "epoch": "1727096311", "epoch_int": "1727096311", "date": "2024-09-23", "time": "08:58:31", "iso8601_micro": "2024-09-23T12:58:31.579592Z", "iso8601": "2024-09-23T12:58:31Z", "iso8601_basic": "20240923T085831579592", "iso8601_basic_short": "20240923T085831", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 40710 10.31.11.125 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 40710 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": 
"", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_fips": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2958, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 573, "free": 2958}, "nocache": {"free": 3296, "used": 235}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_uuid": 
"ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 464, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261795463168, "block_size": 4096, "block_total": 65519099, "block_available": 63914908, "block_used": 1604191, "inode_total": 131070960, "inode_available": 131029099, "inode_used": 41861, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_is_chroot": false, 
"ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_iscsi_iqn": "", "ansible_interfaces": ["lsr27", "lo", "eth0", "peerlsr27"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", 
"tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10ff:acff:fe3f:90f5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": 
"off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "ea:ad:4d:e4:a2:0e", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::e8ad:4dff:fee4:a20e", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", 
"tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lsr27": {"device": "lsr27", "macaddress": 
"da:be:ac:0e:e2:e3", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::d8be:acff:fe0e:e2e3", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", 
"rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.125"], "ansible_all_ipv6_addresses": ["fe80::10ff:acff:fe3f:90f5", "fe80::e8ad:4dff:fee4:a20e", "fe80::d8be:acff:fe0e:e2e3"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.125", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10ff:acff:fe3f:90f5", "fe80::d8be:acff:fe0e:e2e3", "fe80::e8ad:4dff:fee4:a20e"]}, "ansible_loadavg": {"1m": 0.71923828125, "5m": 0.4150390625, "15m": 0.20068359375}, "ansible_local": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_service_mgr": "systemd", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 18911 1727096312.00338: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
<<< 18911 1727096312.00351: stdout chunk (state=3): >>><<< 18911 1727096312.00473: stderr chunk (state=3): >>><<< 18911 1727096312.00577: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_lsb": {}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCSodd+GDkKMH+cjKateYxMnFqGIiE1cnsha6Cte8GTIWdeerR6yaWx/72lnuD/E1u9q+QiEGNjwKJ+EVlEbMpe1E41fS12cQMWtucXlJvMJH1Ykit9m7FF02YjLW9EkA9SSgc18cxhBxGHI+sOyXoiqbbp/q8CeDDI6WWQ+a3ZlRONcuaxwv9JBCJY6ZIqEDUsG55fAkwTft0LrM9VfN5D7iCcrlRTAVcjACJKgdvzHt52sVh5wuhCgzisIGrxldgqYAo0PnKgLwHzMWTYpwBJQIwMf6hC7O+PTnljDRXhsY8vTDdVI9W4Nr7mLWJ7W5JZ0lrKajT+FZ1b8usigp6firlnTtqzVmKJ4HJTlBOQ00HmCvVRSFb8Y15UB8Gz/xjJSXiQklJoKt+/wRRazuZALEowaIIdTzQUGv5Yx/BYhcAv8KVvhqbfkvyuwAgI+jzck3nhlUJ3B1WVYs5RdDY1WWEbqEtKyERi7P+cyWg6OhEMmJIpLs1QBW+CU/89w+s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE9Vvl7mU+dsvtGHlOyMt0Nx2BOUkSxor6Ul/pQgHJkP0t5ie5s9TW9uniTzz5NfayEXaMlnOkK/UBl0o7msG9k=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOsOWQBYI+BXY81/x/jv3URNQxi5p8vLDeQfEZVgnjS4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-125", "ansible_nodename": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_domain": 
"us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2082d1236a0674562ec5e13633e7ec", "ansible_fibre_channel_wwn": [], "ansible_apparmor": {"status": "disabled"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "58", "second": "31", "epoch": "1727096311", "epoch_int": "1727096311", "date": "2024-09-23", "time": "08:58:31", "iso8601_micro": "2024-09-23T12:58:31.579592Z", "iso8601": "2024-09-23T12:58:31Z", "iso8601_basic": "20240923T085831579592", "iso8601_basic_short": "20240923T085831", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 40710 10.31.11.125 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 40710 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, 
"ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_fips": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2958, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 573, "free": 2958}, "nocache": {"free": 3296, "used": 235}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_uuid": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": 
{"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 464, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261795463168, "block_size": 4096, "block_total": 65519099, "block_available": 63914908, "block_used": 1604191, "inode_total": 131070960, "inode_available": 131029099, "inode_used": 41861, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_is_chroot": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_iscsi_iqn": "", "ansible_interfaces": ["lsr27", "lo", "eth0", "peerlsr27"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", 
"broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", 
"rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10ff:acff:fe3f:90f5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off 
[fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "ea:ad:4d:e4:a2:0e", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::e8ad:4dff:fee4:a20e", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": 
"off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lsr27": {"device": "lsr27", "macaddress": "da:be:ac:0e:e2:e3", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::d8be:acff:fe0e:e2e3", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": 
"off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": 
"10.31.8.1", "interface": "eth0", "address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.125"], "ansible_all_ipv6_addresses": ["fe80::10ff:acff:fe3f:90f5", "fe80::e8ad:4dff:fee4:a20e", "fe80::d8be:acff:fe0e:e2e3"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.125", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10ff:acff:fe3f:90f5", "fe80::d8be:acff:fe0e:e2e3", "fe80::e8ad:4dff:fee4:a20e"]}, "ansible_loadavg": {"1m": 0.71923828125, "5m": 0.4150390625, "15m": 0.20068359375}, "ansible_local": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_service_mgr": "systemd", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 18911 1727096312.01674: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096311.1803036-20378-169819597523492/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18911 1727096312.01678: _low_level_execute_command(): starting 18911 1727096312.01681: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096311.1803036-20378-169819597523492/ > /dev/null 2>&1 && sleep 0' 18911 1727096312.02694: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096312.02698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 18911 1727096312.02701: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 18911 1727096312.02838: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096312.02869: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096312.03030: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096312.03124: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096312.05261: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096312.05270: stdout chunk (state=3): >>><<< 18911 1727096312.05273: stderr chunk (state=3): >>><<< 18911 1727096312.05276: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096312.05279: handler run complete 18911 1727096312.05593: variable 'ansible_facts' from source: unknown 18911 1727096312.05909: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096312.06798: variable 'ansible_facts' from source: unknown 18911 1727096312.06932: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096312.07385: attempt loop complete, returning result 18911 1727096312.07388: _execute() done 18911 1727096312.07390: dumping result to json 18911 1727096312.07392: done dumping result, returning 18911 1727096312.07394: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0afff68d-5257-09a7-aae1-0000000003f8] 18911 1727096312.07572: sending task result for task 0afff68d-5257-09a7-aae1-0000000003f8 ok: [managed_node1] 18911 1727096312.09260: no more pending results, returning what we have 18911 1727096312.09266: results queue empty 18911 1727096312.09292: done sending task result for task 0afff68d-5257-09a7-aae1-0000000003f8 18911 1727096312.09298: WORKER PROCESS EXITING 18911 1727096312.09294: checking for any_errors_fatal 18911 1727096312.09300: done checking for any_errors_fatal 18911 1727096312.09301: checking for max_fail_percentage 18911 1727096312.09303: done checking for max_fail_percentage 18911 1727096312.09304: checking to see if all hosts have failed and the 
running result is not ok 18911 1727096312.09304: done checking to see if all hosts have failed 18911 1727096312.09305: getting the remaining hosts for this loop 18911 1727096312.09306: done getting the remaining hosts for this loop 18911 1727096312.09310: getting the next task for host managed_node1 18911 1727096312.09315: done getting next task for host managed_node1 18911 1727096312.09317: ^ task is: TASK: meta (flush_handlers) 18911 1727096312.09319: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18911 1727096312.09323: getting variables 18911 1727096312.09324: in VariableManager get_vars() 18911 1727096312.09459: Calling all_inventory to load vars for managed_node1 18911 1727096312.09465: Calling groups_inventory to load vars for managed_node1 18911 1727096312.09469: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096312.09479: Calling all_plugins_play to load vars for managed_node1 18911 1727096312.09482: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096312.09485: Calling groups_plugins_play to load vars for managed_node1 18911 1727096312.12217: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096312.16546: done with get_vars() 18911 1727096312.16573: done getting variables 18911 1727096312.16764: in VariableManager get_vars() 18911 1727096312.16879: Calling all_inventory to load vars for managed_node1 18911 1727096312.16882: Calling groups_inventory to load vars for managed_node1 18911 1727096312.16884: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096312.16889: Calling all_plugins_play to load vars for managed_node1 18911 1727096312.16892: Calling 
groups_plugins_inventory to load vars for managed_node1 18911 1727096312.16894: Calling groups_plugins_play to load vars for managed_node1 18911 1727096312.19429: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096312.22957: done with get_vars() 18911 1727096312.23228: done queuing things up, now waiting for results queue to drain 18911 1727096312.23231: results queue empty 18911 1727096312.23232: checking for any_errors_fatal 18911 1727096312.23237: done checking for any_errors_fatal 18911 1727096312.23237: checking for max_fail_percentage 18911 1727096312.23238: done checking for max_fail_percentage 18911 1727096312.23244: checking to see if all hosts have failed and the running result is not ok 18911 1727096312.23245: done checking to see if all hosts have failed 18911 1727096312.23246: getting the remaining hosts for this loop 18911 1727096312.23247: done getting the remaining hosts for this loop 18911 1727096312.23250: getting the next task for host managed_node1 18911 1727096312.23254: done getting next task for host managed_node1 18911 1727096312.23257: ^ task is: TASK: Include the task 'delete_interface.yml' 18911 1727096312.23259: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096312.23262: getting variables 18911 1727096312.23265: in VariableManager get_vars() 18911 1727096312.23428: Calling all_inventory to load vars for managed_node1 18911 1727096312.23431: Calling groups_inventory to load vars for managed_node1 18911 1727096312.23433: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096312.23440: Calling all_plugins_play to load vars for managed_node1 18911 1727096312.23442: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096312.23446: Calling groups_plugins_play to load vars for managed_node1 18911 1727096312.27496: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096312.38609: done with get_vars() 18911 1727096312.38636: done getting variables TASK [Include the task 'delete_interface.yml'] ********************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:8 Monday 23 September 2024 08:58:32 -0400 (0:00:01.259) 0:00:31.501 ****** 18911 1727096312.38714: entering _queue_task() for managed_node1/include_tasks 18911 1727096312.39647: worker is 1 (out of 1 available) 18911 1727096312.39661: exiting _queue_task() for managed_node1/include_tasks 18911 1727096312.39879: done queuing things up, now waiting for results queue to drain 18911 1727096312.39881: waiting for pending results... 
18911 1727096312.40196: running TaskExecutor() for managed_node1/TASK: Include the task 'delete_interface.yml' 18911 1727096312.40435: in run() - task 0afff68d-5257-09a7-aae1-000000000054 18911 1727096312.40456: variable 'ansible_search_path' from source: unknown 18911 1727096312.40630: calling self._execute() 18911 1727096312.40734: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096312.40876: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096312.40961: variable 'omit' from source: magic vars 18911 1727096312.41826: variable 'ansible_distribution_major_version' from source: facts 18911 1727096312.41830: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096312.41833: _execute() done 18911 1727096312.41836: dumping result to json 18911 1727096312.41839: done dumping result, returning 18911 1727096312.41841: done running TaskExecutor() for managed_node1/TASK: Include the task 'delete_interface.yml' [0afff68d-5257-09a7-aae1-000000000054] 18911 1727096312.41843: sending task result for task 0afff68d-5257-09a7-aae1-000000000054 18911 1727096312.41936: done sending task result for task 0afff68d-5257-09a7-aae1-000000000054 18911 1727096312.41940: WORKER PROCESS EXITING 18911 1727096312.41973: no more pending results, returning what we have 18911 1727096312.41978: in VariableManager get_vars() 18911 1727096312.42014: Calling all_inventory to load vars for managed_node1 18911 1727096312.42016: Calling groups_inventory to load vars for managed_node1 18911 1727096312.42019: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096312.42032: Calling all_plugins_play to load vars for managed_node1 18911 1727096312.42035: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096312.42038: Calling groups_plugins_play to load vars for managed_node1 18911 1727096312.44962: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096312.46591: done with get_vars() 18911 1727096312.46614: variable 'ansible_search_path' from source: unknown 18911 1727096312.46631: we have included files to process 18911 1727096312.46632: generating all_blocks data 18911 1727096312.46634: done generating all_blocks data 18911 1727096312.46635: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 18911 1727096312.46636: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 18911 1727096312.46639: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 18911 1727096312.46862: done processing included file 18911 1727096312.46866: iterating over new_blocks loaded from include file 18911 1727096312.46870: in VariableManager get_vars() 18911 1727096312.46882: done with get_vars() 18911 1727096312.46884: filtering new block on tags 18911 1727096312.46899: done filtering new block on tags 18911 1727096312.46901: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed_node1 18911 1727096312.46906: extending task lists for all hosts with included blocks 18911 1727096312.46936: done extending task lists 18911 1727096312.46937: done processing included files 18911 1727096312.46938: results queue empty 18911 1727096312.46938: checking for any_errors_fatal 18911 1727096312.46940: done checking for any_errors_fatal 18911 1727096312.46941: checking for max_fail_percentage 18911 1727096312.46942: done checking for max_fail_percentage 18911 1727096312.46942: checking to see if all hosts have failed and the running result 
is not ok 18911 1727096312.46943: done checking to see if all hosts have failed 18911 1727096312.46944: getting the remaining hosts for this loop 18911 1727096312.46945: done getting the remaining hosts for this loop 18911 1727096312.46947: getting the next task for host managed_node1 18911 1727096312.46950: done getting next task for host managed_node1 18911 1727096312.46952: ^ task is: TASK: Remove test interface if necessary 18911 1727096312.46954: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096312.46956: getting variables 18911 1727096312.46957: in VariableManager get_vars() 18911 1727096312.46971: Calling all_inventory to load vars for managed_node1 18911 1727096312.46973: Calling groups_inventory to load vars for managed_node1 18911 1727096312.46976: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096312.46981: Calling all_plugins_play to load vars for managed_node1 18911 1727096312.46983: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096312.46986: Calling groups_plugins_play to load vars for managed_node1 18911 1727096312.49272: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096312.52798: done with get_vars() 18911 1727096312.52827: done getting variables 18911 1727096312.52984: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Monday 23 September 2024 08:58:32 -0400 (0:00:00.142) 0:00:31.644 ****** 18911 1727096312.53018: entering _queue_task() for managed_node1/command 18911 1727096312.53757: worker is 1 (out of 1 available) 18911 1727096312.53774: exiting _queue_task() for managed_node1/command 18911 1727096312.53786: done queuing things up, now waiting for results queue to drain 18911 1727096312.53787: waiting for pending results... 
18911 1727096312.54261: running TaskExecutor() for managed_node1/TASK: Remove test interface if necessary 18911 1727096312.54487: in run() - task 0afff68d-5257-09a7-aae1-000000000409 18911 1727096312.54510: variable 'ansible_search_path' from source: unknown 18911 1727096312.54517: variable 'ansible_search_path' from source: unknown 18911 1727096312.54898: calling self._execute() 18911 1727096312.55026: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096312.55172: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096312.55175: variable 'omit' from source: magic vars 18911 1727096312.55849: variable 'ansible_distribution_major_version' from source: facts 18911 1727096312.55890: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096312.56073: variable 'omit' from source: magic vars 18911 1727096312.56077: variable 'omit' from source: magic vars 18911 1727096312.56235: variable 'interface' from source: set_fact 18911 1727096312.56260: variable 'omit' from source: magic vars 18911 1727096312.56352: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18911 1727096312.56454: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18911 1727096312.56549: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18911 1727096312.56576: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096312.56593: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096312.56630: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18911 1727096312.56874: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096312.56877: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096312.56880: Set connection var ansible_shell_executable to /bin/sh 18911 1727096312.56882: Set connection var ansible_timeout to 10 18911 1727096312.56885: Set connection var ansible_shell_type to sh 18911 1727096312.56991: Set connection var ansible_module_compression to ZIP_DEFLATED 18911 1727096312.57002: Set connection var ansible_pipelining to False 18911 1727096312.57013: Set connection var ansible_connection to ssh 18911 1727096312.57042: variable 'ansible_shell_executable' from source: unknown 18911 1727096312.57096: variable 'ansible_connection' from source: unknown 18911 1727096312.57105: variable 'ansible_module_compression' from source: unknown 18911 1727096312.57112: variable 'ansible_shell_type' from source: unknown 18911 1727096312.57118: variable 'ansible_shell_executable' from source: unknown 18911 1727096312.57125: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096312.57133: variable 'ansible_pipelining' from source: unknown 18911 1727096312.57139: variable 'ansible_timeout' from source: unknown 18911 1727096312.57147: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096312.57373: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18911 1727096312.57674: variable 'omit' from source: magic vars 18911 1727096312.57677: starting attempt loop 18911 1727096312.57680: running the handler 18911 1727096312.57682: _low_level_execute_command(): starting 18911 1727096312.57684: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18911 1727096312.59129: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096312.59147: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found <<< 18911 1727096312.59158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096312.59295: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096312.59309: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096312.59480: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096312.59579: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096312.61316: stdout chunk (state=3): >>>/root <<< 18911 1727096312.61413: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096312.61501: stderr chunk (state=3): >>><<< 18911 1727096312.61504: stdout chunk (state=3): >>><<< 18911 1727096312.61526: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096312.61546: _low_level_execute_command(): starting 18911 1727096312.61663: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096312.6153264-20434-118066937155218 `" && echo ansible-tmp-1727096312.6153264-20434-118066937155218="` echo /root/.ansible/tmp/ansible-tmp-1727096312.6153264-20434-118066937155218 `" ) && sleep 0' 18911 1727096312.62725: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096312.62884: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096312.62974: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096312.63124: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096312.63204: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096312.65184: stdout chunk (state=3): >>>ansible-tmp-1727096312.6153264-20434-118066937155218=/root/.ansible/tmp/ansible-tmp-1727096312.6153264-20434-118066937155218 <<< 18911 1727096312.65323: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096312.65333: stdout chunk (state=3): >>><<< 18911 1727096312.65345: stderr chunk (state=3): >>><<< 18911 1727096312.65370: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096312.6153264-20434-118066937155218=/root/.ansible/tmp/ansible-tmp-1727096312.6153264-20434-118066937155218 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096312.65774: variable 'ansible_module_compression' from source: unknown 18911 1727096312.65777: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18911d7od04qi/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 18911 1727096312.65779: variable 'ansible_facts' from source: unknown 18911 1727096312.65781: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096312.6153264-20434-118066937155218/AnsiballZ_command.py 18911 1727096312.66116: Sending initial data 18911 1727096312.66181: Sent initial data (156 bytes) 18911 1727096312.67298: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096312.67312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096312.67323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096312.67372: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096312.67408: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096312.67625: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096312.67717: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096312.69362: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18911 1727096312.69419: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18911 1727096312.69501: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18911d7od04qi/tmpuemgcuua /root/.ansible/tmp/ansible-tmp-1727096312.6153264-20434-118066937155218/AnsiballZ_command.py <<< 18911 1727096312.69526: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096312.6153264-20434-118066937155218/AnsiballZ_command.py" <<< 18911 1727096312.69698: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18911d7od04qi/tmpuemgcuua" to remote "/root/.ansible/tmp/ansible-tmp-1727096312.6153264-20434-118066937155218/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096312.6153264-20434-118066937155218/AnsiballZ_command.py" <<< 18911 1727096312.71412: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096312.71472: stderr chunk (state=3): >>><<< 18911 1727096312.71674: stdout chunk (state=3): >>><<< 18911 1727096312.71677: done transferring module to remote 18911 1727096312.71679: _low_level_execute_command(): starting 18911 1727096312.71680: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096312.6153264-20434-118066937155218/ /root/.ansible/tmp/ansible-tmp-1727096312.6153264-20434-118066937155218/AnsiballZ_command.py && sleep 0' 18911 1727096312.72898: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096312.72902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096312.73092: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096312.73099: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096312.73244: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096312.75101: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096312.75138: stderr chunk (state=3): >>><<< 18911 1727096312.75141: stdout chunk (state=3): >>><<< 18911 1727096312.75157: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096312.75164: _low_level_execute_command(): starting 18911 1727096312.75194: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096312.6153264-20434-118066937155218/AnsiballZ_command.py && sleep 0' 18911 1727096312.76323: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096312.76326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096312.76329: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096312.76331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096312.76472: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096312.76677: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096312.76680: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096312.93560: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "lsr27"], "start": "2024-09-23 08:58:32.919717", "end": "2024-09-23 08:58:32.933068", "delta": "0:00:00.013351", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del lsr27", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 18911 1727096312.96080: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. <<< 18911 1727096312.96084: stdout chunk (state=3): >>><<< 18911 1727096312.96086: stderr chunk (state=3): >>><<< 18911 1727096312.96090: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "lsr27"], "start": "2024-09-23 08:58:32.919717", "end": "2024-09-23 08:58:32.933068", "delta": "0:00:00.013351", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del lsr27", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 18911 1727096312.96102: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del lsr27', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096312.6153264-20434-118066937155218/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18911 1727096312.96116: _low_level_execute_command(): starting 18911 1727096312.96126: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096312.6153264-20434-118066937155218/ > /dev/null 2>&1 && sleep 0' 18911 1727096312.96776: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096312.96792: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096312.96806: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 18911 1727096312.96832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096312.96874: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found <<< 18911 1727096312.96890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096312.97003: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096312.97083: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096312.98989: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096312.98999: stderr chunk (state=3): >>><<< 18911 1727096312.99002: stdout chunk (state=3): >>><<< 18911 1727096312.99018: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096312.99023: handler run complete 18911 1727096312.99045: Evaluated conditional (False): False 18911 1727096312.99054: attempt loop complete, returning result 18911 1727096312.99057: _execute() done 18911 1727096312.99059: dumping result to json 18911 1727096312.99066: done dumping result, returning 18911 1727096312.99074: done running TaskExecutor() for managed_node1/TASK: Remove test interface if necessary [0afff68d-5257-09a7-aae1-000000000409] 18911 1727096312.99078: sending task result for task 0afff68d-5257-09a7-aae1-000000000409 18911 1727096312.99173: done sending task result for task 0afff68d-5257-09a7-aae1-000000000409 18911 1727096312.99176: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "ip", "link", "del", "lsr27" ], "delta": "0:00:00.013351", "end": "2024-09-23 08:58:32.933068", "rc": 0, "start": "2024-09-23 08:58:32.919717" } 18911 1727096312.99233: no more pending results, returning what we have 18911 1727096312.99236: results queue empty 18911 1727096312.99237: checking for any_errors_fatal 18911 1727096312.99239: done checking for any_errors_fatal 18911 1727096312.99239: checking for max_fail_percentage 18911 1727096312.99241: done checking for max_fail_percentage 18911 
1727096312.99241: checking to see if all hosts have failed and the running result is not ok 18911 1727096312.99242: done checking to see if all hosts have failed 18911 1727096312.99243: getting the remaining hosts for this loop 18911 1727096312.99244: done getting the remaining hosts for this loop 18911 1727096312.99247: getting the next task for host managed_node1 18911 1727096312.99257: done getting next task for host managed_node1 18911 1727096312.99259: ^ task is: TASK: meta (flush_handlers) 18911 1727096312.99261: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18911 1727096312.99270: getting variables 18911 1727096312.99272: in VariableManager get_vars() 18911 1727096312.99302: Calling all_inventory to load vars for managed_node1 18911 1727096312.99304: Calling groups_inventory to load vars for managed_node1 18911 1727096312.99308: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096312.99318: Calling all_plugins_play to load vars for managed_node1 18911 1727096312.99321: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096312.99323: Calling groups_plugins_play to load vars for managed_node1 18911 1727096313.00299: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096313.01532: done with get_vars() 18911 1727096313.01552: done getting variables 18911 1727096313.01607: in VariableManager get_vars() 18911 1727096313.01614: Calling all_inventory to load vars for managed_node1 18911 1727096313.01616: Calling groups_inventory to load vars for managed_node1 18911 1727096313.01617: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096313.01621: Calling all_plugins_play to load vars for 
managed_node1 18911 1727096313.01622: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096313.01624: Calling groups_plugins_play to load vars for managed_node1 18911 1727096313.02378: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096313.03432: done with get_vars() 18911 1727096313.03470: done queuing things up, now waiting for results queue to drain 18911 1727096313.03473: results queue empty 18911 1727096313.03473: checking for any_errors_fatal 18911 1727096313.03477: done checking for any_errors_fatal 18911 1727096313.03478: checking for max_fail_percentage 18911 1727096313.03479: done checking for max_fail_percentage 18911 1727096313.03479: checking to see if all hosts have failed and the running result is not ok 18911 1727096313.03480: done checking to see if all hosts have failed 18911 1727096313.03481: getting the remaining hosts for this loop 18911 1727096313.03482: done getting the remaining hosts for this loop 18911 1727096313.03485: getting the next task for host managed_node1 18911 1727096313.03489: done getting next task for host managed_node1 18911 1727096313.03490: ^ task is: TASK: meta (flush_handlers) 18911 1727096313.03492: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096313.03494: getting variables 18911 1727096313.03495: in VariableManager get_vars() 18911 1727096313.03505: Calling all_inventory to load vars for managed_node1 18911 1727096313.03507: Calling groups_inventory to load vars for managed_node1 18911 1727096313.03510: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096313.03516: Calling all_plugins_play to load vars for managed_node1 18911 1727096313.03518: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096313.03521: Calling groups_plugins_play to load vars for managed_node1 18911 1727096313.04700: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096313.06380: done with get_vars() 18911 1727096313.06401: done getting variables 18911 1727096313.06455: in VariableManager get_vars() 18911 1727096313.06469: Calling all_inventory to load vars for managed_node1 18911 1727096313.06471: Calling groups_inventory to load vars for managed_node1 18911 1727096313.06474: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096313.06479: Calling all_plugins_play to load vars for managed_node1 18911 1727096313.06481: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096313.06484: Calling groups_plugins_play to load vars for managed_node1 18911 1727096313.07804: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096313.09416: done with get_vars() 18911 1727096313.09447: done queuing things up, now waiting for results queue to drain 18911 1727096313.09449: results queue empty 18911 1727096313.09450: checking for any_errors_fatal 18911 1727096313.09452: done checking for any_errors_fatal 18911 1727096313.09452: checking for max_fail_percentage 18911 1727096313.09453: done checking for max_fail_percentage 18911 1727096313.09454: checking to see if all hosts have failed and the running result is not 
ok 18911 1727096313.09455: done checking to see if all hosts have failed 18911 1727096313.09455: getting the remaining hosts for this loop 18911 1727096313.09456: done getting the remaining hosts for this loop 18911 1727096313.09459: getting the next task for host managed_node1 18911 1727096313.09465: done getting next task for host managed_node1 18911 1727096313.09466: ^ task is: None 18911 1727096313.09470: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18911 1727096313.09471: done queuing things up, now waiting for results queue to drain 18911 1727096313.09472: results queue empty 18911 1727096313.09473: checking for any_errors_fatal 18911 1727096313.09473: done checking for any_errors_fatal 18911 1727096313.09474: checking for max_fail_percentage 18911 1727096313.09475: done checking for max_fail_percentage 18911 1727096313.09476: checking to see if all hosts have failed and the running result is not ok 18911 1727096313.09476: done checking to see if all hosts have failed 18911 1727096313.09477: getting the next task for host managed_node1 18911 1727096313.09480: done getting next task for host managed_node1 18911 1727096313.09481: ^ task is: None 18911 1727096313.09482: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096313.09520: in VariableManager get_vars() 18911 1727096313.09542: done with get_vars() 18911 1727096313.09549: in VariableManager get_vars() 18911 1727096313.09562: done with get_vars() 18911 1727096313.09572: variable 'omit' from source: magic vars 18911 1727096313.09697: variable 'profile' from source: play vars 18911 1727096313.09800: in VariableManager get_vars() 18911 1727096313.09815: done with get_vars() 18911 1727096313.09837: variable 'omit' from source: magic vars 18911 1727096313.09904: variable 'profile' from source: play vars PLAY [Remove {{ profile }}] **************************************************** 18911 1727096313.10607: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18911 1727096313.10628: getting the remaining hosts for this loop 18911 1727096313.10629: done getting the remaining hosts for this loop 18911 1727096313.10631: getting the next task for host managed_node1 18911 1727096313.10634: done getting next task for host managed_node1 18911 1727096313.10635: ^ task is: TASK: Gathering Facts 18911 1727096313.10637: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096313.10639: getting variables 18911 1727096313.10640: in VariableManager get_vars() 18911 1727096313.10652: Calling all_inventory to load vars for managed_node1 18911 1727096313.10657: Calling groups_inventory to load vars for managed_node1 18911 1727096313.10659: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096313.10666: Calling all_plugins_play to load vars for managed_node1 18911 1727096313.10670: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096313.10673: Calling groups_plugins_play to load vars for managed_node1 18911 1727096313.11984: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096313.13526: done with get_vars() 18911 1727096313.13546: done getting variables 18911 1727096313.13592: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3 Monday 23 September 2024 08:58:33 -0400 (0:00:00.605) 0:00:32.250 ****** 18911 1727096313.13618: entering _queue_task() for managed_node1/gather_facts 18911 1727096313.13949: worker is 1 (out of 1 available) 18911 1727096313.13961: exiting _queue_task() for managed_node1/gather_facts 18911 1727096313.13974: done queuing things up, now waiting for results queue to drain 18911 1727096313.13976: waiting for pending results... 
18911 1727096313.14249: running TaskExecutor() for managed_node1/TASK: Gathering Facts 18911 1727096313.14573: in run() - task 0afff68d-5257-09a7-aae1-000000000417 18911 1727096313.14577: variable 'ansible_search_path' from source: unknown 18911 1727096313.14580: calling self._execute() 18911 1727096313.14582: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096313.14586: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096313.14588: variable 'omit' from source: magic vars 18911 1727096313.14903: variable 'ansible_distribution_major_version' from source: facts 18911 1727096313.14928: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096313.14940: variable 'omit' from source: magic vars 18911 1727096313.14976: variable 'omit' from source: magic vars 18911 1727096313.15018: variable 'omit' from source: magic vars 18911 1727096313.15069: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18911 1727096313.15109: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18911 1727096313.15140: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18911 1727096313.15163: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096313.15183: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096313.15216: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18911 1727096313.15226: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096313.15234: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096313.15341: Set connection var ansible_shell_executable to /bin/sh 18911 1727096313.15357: Set 
connection var ansible_timeout to 10 18911 1727096313.15363: Set connection var ansible_shell_type to sh 18911 1727096313.15378: Set connection var ansible_module_compression to ZIP_DEFLATED 18911 1727096313.15387: Set connection var ansible_pipelining to False 18911 1727096313.15395: Set connection var ansible_connection to ssh 18911 1727096313.15421: variable 'ansible_shell_executable' from source: unknown 18911 1727096313.15429: variable 'ansible_connection' from source: unknown 18911 1727096313.15437: variable 'ansible_module_compression' from source: unknown 18911 1727096313.15446: variable 'ansible_shell_type' from source: unknown 18911 1727096313.15453: variable 'ansible_shell_executable' from source: unknown 18911 1727096313.15572: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096313.15576: variable 'ansible_pipelining' from source: unknown 18911 1727096313.15578: variable 'ansible_timeout' from source: unknown 18911 1727096313.15581: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096313.15662: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18911 1727096313.15685: variable 'omit' from source: magic vars 18911 1727096313.15694: starting attempt loop 18911 1727096313.15702: running the handler 18911 1727096313.15722: variable 'ansible_facts' from source: unknown 18911 1727096313.15747: _low_level_execute_command(): starting 18911 1727096313.15759: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18911 1727096313.16480: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096313.16494: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 
1727096313.16562: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096313.16617: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096313.16637: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096313.16672: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096313.16779: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096313.18504: stdout chunk (state=3): >>>/root <<< 18911 1727096313.18637: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096313.18650: stdout chunk (state=3): >>><<< 18911 1727096313.18672: stderr chunk (state=3): >>><<< 18911 1727096313.18782: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096313.18785: _low_level_execute_command(): starting 18911 1727096313.18788: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096313.186953-20474-122108532924907 `" && echo ansible-tmp-1727096313.186953-20474-122108532924907="` echo /root/.ansible/tmp/ansible-tmp-1727096313.186953-20474-122108532924907 `" ) && sleep 0' 18911 1727096313.19587: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096313.19612: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096313.19629: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096313.19703: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096313.19805: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096313.21735: stdout chunk (state=3): >>>ansible-tmp-1727096313.186953-20474-122108532924907=/root/.ansible/tmp/ansible-tmp-1727096313.186953-20474-122108532924907 <<< 18911 1727096313.21893: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096313.21896: stderr chunk (state=3): >>><<< 18911 1727096313.21922: stdout chunk (state=3): >>><<< 18911 1727096313.21949: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096313.186953-20474-122108532924907=/root/.ansible/tmp/ansible-tmp-1727096313.186953-20474-122108532924907 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096313.21996: variable 'ansible_module_compression' from source: unknown 18911 1727096313.22050: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18911d7od04qi/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 18911 1727096313.22123: variable 'ansible_facts' from source: unknown 18911 1727096313.22351: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096313.186953-20474-122108532924907/AnsiballZ_setup.py 18911 1727096313.22541: Sending initial data 18911 1727096313.22545: Sent initial data (153 bytes) 18911 1727096313.23158: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096313.23183: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096313.23288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096313.23308: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096313.23321: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096313.23417: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096313.24994: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18911 1727096313.25086: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18911 1727096313.25166: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18911d7od04qi/tmpdr7kpw1l /root/.ansible/tmp/ansible-tmp-1727096313.186953-20474-122108532924907/AnsiballZ_setup.py <<< 18911 1727096313.25184: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096313.186953-20474-122108532924907/AnsiballZ_setup.py" <<< 18911 1727096313.25231: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18911d7od04qi/tmpdr7kpw1l" to remote "/root/.ansible/tmp/ansible-tmp-1727096313.186953-20474-122108532924907/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096313.186953-20474-122108532924907/AnsiballZ_setup.py" <<< 18911 1727096313.27278: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096313.27283: stdout chunk (state=3): >>><<< 18911 1727096313.27286: stderr chunk (state=3): >>><<< 18911 1727096313.27288: done transferring module to remote 18911 1727096313.27290: _low_level_execute_command(): starting 18911 1727096313.27293: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096313.186953-20474-122108532924907/ /root/.ansible/tmp/ansible-tmp-1727096313.186953-20474-122108532924907/AnsiballZ_setup.py && sleep 0' 18911 1727096313.28460: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096313.28480: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096313.28574: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096313.28874: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096313.30639: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096313.30682: stdout chunk (state=3): >>><<< 18911 1727096313.30696: stderr chunk (state=3): >>><<< 18911 1727096313.30722: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096313.30762: _low_level_execute_command(): starting 18911 1727096313.30788: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096313.186953-20474-122108532924907/AnsiballZ_setup.py && sleep 0' 18911 1727096313.31831: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096313.31866: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096313.31972: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096313.32066: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096313.32189: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096313.96093: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", 
"ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-125", "ansible_nodename": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2082d1236a0674562ec5e13633e7ec", "ansible_loadavg": {"1m": 0.71923828125, "5m": 0.4150390625, "15m": 0.20068359375}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fibre_channel_wwn": [], "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCSodd+GDkKMH+cjKateYxMnFqGIiE1cnsha6Cte8GTIWdeerR6yaWx/72lnuD/E1u9q+QiEGNjwKJ+EVlEbMpe1E41fS12cQMWtucXlJvMJH1Ykit9m7FF02YjLW9EkA9SSgc18cxhBxGHI+sOyXoiqbbp/q8CeDDI6WWQ+a3ZlRONcuaxwv9JBCJY6ZIqEDUsG55fAkwTft0LrM9VfN5D7iCcrlRTAVcjACJKgdvzHt52sVh5wuhCgzisIGrxldgqYAo0PnKgLwHzMWTYpwBJQIwMf6hC7O+PTnljDRXhsY8vTDdVI9W4Nr7mLWJ7W5JZ0lrKajT+FZ1b8usigp6firlnTtqzVmKJ4HJTlBOQ00HmCvVRSFb8Y15UB8Gz/xjJSXiQklJoKt+/wRRazuZALEowaIIdTzQUGv5Yx/BYhcAv8KVvhqbfkvyuwAgI+jzck3nhlUJ3B1WVYs5RdDY1WWEbqEtKyERi7P+cyWg6OhEMmJIpLs1QBW+CU/89w+s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE9Vvl7mU+dsvtGHlOyMt0Nx2BOUkSxor6Ul/pQgHJkP0t5ie5s9TW9uniTzz5NfayEXaMlnOkK/UBl0o7msG9k=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOsOWQBYI+BXY81/x/jv3URNQxi5p8vLDeQfEZVgnjS4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "58", "second": "33", "epoch": "1727096313", "epoch_int": "1727096313", "date": "2024-09-23", "time": "08:58:33", "iso8601_micro": "2024-09-23T12:58:33.595038Z", "iso8601": "2024-09-23T12:58:33Z", "iso8601_basic": "20240923T085833595038", "iso8601_basic_short": "20240923T085833", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": 
"UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_pkg_mgr": "dnf", "ansible_iscsi_iqn": "", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fips": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_service_mgr": "systemd", "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on 
[fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10ff:acff:fe3f:90f5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": 
"on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", 
"rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.125"], "ansible_all_ipv6_addresses": ["fe80::10ff:acff:fe3f:90f5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.125", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10ff:acff:fe3f:90f5"]}, "ansible_is_chroot": false, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 40710 10.31.11.125 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 40710 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_local": {}, "ansible_lsb": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, 
"ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2953, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 578, "free": 2953}, "nocache": {"free": 3291, "used": 240}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_uuid": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 466, "ansible_lvm": {"lvs": {}, "vgs": {}, 
"pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261795397632, "block_size": 4096, "block_total": 65519099, "block_available": 63914892, "block_used": 1604207, "inode_total": 131070960, "inode_available": 131029099, "inode_used": 41861, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 18911 1727096313.98399: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. <<< 18911 1727096313.98574: stderr chunk (state=3): >>><<< 18911 1727096313.98578: stdout chunk (state=3): >>><<< 18911 1727096313.98582: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-125", "ansible_nodename": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2082d1236a0674562ec5e13633e7ec", "ansible_loadavg": {"1m": 0.71923828125, "5m": 0.4150390625, "15m": 0.20068359375}, 
"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fibre_channel_wwn": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCSodd+GDkKMH+cjKateYxMnFqGIiE1cnsha6Cte8GTIWdeerR6yaWx/72lnuD/E1u9q+QiEGNjwKJ+EVlEbMpe1E41fS12cQMWtucXlJvMJH1Ykit9m7FF02YjLW9EkA9SSgc18cxhBxGHI+sOyXoiqbbp/q8CeDDI6WWQ+a3ZlRONcuaxwv9JBCJY6ZIqEDUsG55fAkwTft0LrM9VfN5D7iCcrlRTAVcjACJKgdvzHt52sVh5wuhCgzisIGrxldgqYAo0PnKgLwHzMWTYpwBJQIwMf6hC7O+PTnljDRXhsY8vTDdVI9W4Nr7mLWJ7W5JZ0lrKajT+FZ1b8usigp6firlnTtqzVmKJ4HJTlBOQ00HmCvVRSFb8Y15UB8Gz/xjJSXiQklJoKt+/wRRazuZALEowaIIdTzQUGv5Yx/BYhcAv8KVvhqbfkvyuwAgI+jzck3nhlUJ3B1WVYs5RdDY1WWEbqEtKyERi7P+cyWg6OhEMmJIpLs1QBW+CU/89w+s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE9Vvl7mU+dsvtGHlOyMt0Nx2BOUkSxor6Ul/pQgHJkP0t5ie5s9TW9uniTzz5NfayEXaMlnOkK/UBl0o7msG9k=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOsOWQBYI+BXY81/x/jv3URNQxi5p8vLDeQfEZVgnjS4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "58", "second": "33", "epoch": "1727096313", "epoch_int": "1727096313", "date": "2024-09-23", "time": "08:58:33", "iso8601_micro": "2024-09-23T12:58:33.595038Z", "iso8601": "2024-09-23T12:58:33Z", "iso8601_basic": "20240923T085833595038", 
"iso8601_basic_short": "20240923T085833", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_pkg_mgr": "dnf", "ansible_iscsi_iqn": "", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fips": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_service_mgr": "systemd", "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", 
"tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", 
"hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10ff:acff:fe3f:90f5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", 
"loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.125"], "ansible_all_ipv6_addresses": ["fe80::10ff:acff:fe3f:90f5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.125", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10ff:acff:fe3f:90f5"]}, "ansible_is_chroot": false, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 40710 10.31.11.125 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 40710 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": 
"/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_local": {}, "ansible_lsb": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2953, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 578, "free": 2953}, "nocache": {"free": 3291, "used": 240}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_uuid": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": 
"4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 466, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261795397632, "block_size": 4096, "block_total": 65519099, "block_available": 63914892, "block_used": 1604207, "inode_total": 131070960, "inode_available": 131029099, "inode_used": 41861, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 18911 1727096313.99256: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096313.186953-20474-122108532924907/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18911 1727096313.99870: _low_level_execute_command(): starting 18911 1727096313.99874: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096313.186953-20474-122108532924907/ > /dev/null 2>&1 && sleep 0' 18911 1727096314.00971: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096314.00975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 
1727096314.01266: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096314.01494: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096314.03535: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096314.03546: stdout chunk (state=3): >>><<< 18911 1727096314.03558: stderr chunk (state=3): >>><<< 18911 1727096314.03586: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096314.03604: handler run complete 18911 1727096314.03854: variable 'ansible_facts' from source: unknown 18911 1727096314.04186: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096314.04741: variable 'ansible_facts' from source: unknown 18911 1727096314.04866: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096314.05176: attempt loop complete, returning result 18911 1727096314.05280: _execute() done 18911 1727096314.05288: dumping result to json 18911 1727096314.05435: done dumping result, returning 18911 1727096314.05438: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0afff68d-5257-09a7-aae1-000000000417] 18911 1727096314.05442: sending task result for task 0afff68d-5257-09a7-aae1-000000000417 18911 1727096314.06373: done sending task result for task 0afff68d-5257-09a7-aae1-000000000417 18911 1727096314.06376: WORKER PROCESS EXITING ok: [managed_node1] 18911 1727096314.07022: no more pending results, returning what we have 18911 1727096314.07025: results queue empty 18911 1727096314.07026: checking for any_errors_fatal 18911 1727096314.07027: done checking for any_errors_fatal 18911 1727096314.07028: checking for max_fail_percentage 18911 1727096314.07030: done checking for max_fail_percentage 18911 1727096314.07031: checking to see if all hosts have failed and the running result is not ok 18911 1727096314.07031: done checking to see if all hosts have failed 18911 1727096314.07032: getting the remaining hosts for this loop 18911 1727096314.07033: done getting the remaining hosts for this loop 18911 
1727096314.07037: getting the next task for host managed_node1 18911 1727096314.07043: done getting next task for host managed_node1 18911 1727096314.07045: ^ task is: TASK: meta (flush_handlers) 18911 1727096314.07047: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18911 1727096314.07050: getting variables 18911 1727096314.07052: in VariableManager get_vars() 18911 1727096314.07084: Calling all_inventory to load vars for managed_node1 18911 1727096314.07087: Calling groups_inventory to load vars for managed_node1 18911 1727096314.07089: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096314.07101: Calling all_plugins_play to load vars for managed_node1 18911 1727096314.07103: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096314.07106: Calling groups_plugins_play to load vars for managed_node1 18911 1727096314.09842: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096314.15282: done with get_vars() 18911 1727096314.15305: done getting variables 18911 1727096314.15787: in VariableManager get_vars() 18911 1727096314.15802: Calling all_inventory to load vars for managed_node1 18911 1727096314.15804: Calling groups_inventory to load vars for managed_node1 18911 1727096314.15806: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096314.15812: Calling all_plugins_play to load vars for managed_node1 18911 1727096314.15814: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096314.15817: Calling groups_plugins_play to load vars for managed_node1 18911 1727096314.19382: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to 
reserved name 18911 1727096314.24225: done with get_vars() 18911 1727096314.24260: done queuing things up, now waiting for results queue to drain 18911 1727096314.24265: results queue empty 18911 1727096314.24265: checking for any_errors_fatal 18911 1727096314.24272: done checking for any_errors_fatal 18911 1727096314.24273: checking for max_fail_percentage 18911 1727096314.24274: done checking for max_fail_percentage 18911 1727096314.24275: checking to see if all hosts have failed and the running result is not ok 18911 1727096314.24276: done checking to see if all hosts have failed 18911 1727096314.24281: getting the remaining hosts for this loop 18911 1727096314.24282: done getting the remaining hosts for this loop 18911 1727096314.24284: getting the next task for host managed_node1 18911 1727096314.24288: done getting next task for host managed_node1 18911 1727096314.24291: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 18911 1727096314.24292: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096314.24302: getting variables 18911 1727096314.24303: in VariableManager get_vars() 18911 1727096314.24317: Calling all_inventory to load vars for managed_node1 18911 1727096314.24319: Calling groups_inventory to load vars for managed_node1 18911 1727096314.24321: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096314.24325: Calling all_plugins_play to load vars for managed_node1 18911 1727096314.24327: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096314.24330: Calling groups_plugins_play to load vars for managed_node1 18911 1727096314.28390: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096314.31982: done with get_vars() 18911 1727096314.32013: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Monday 23 September 2024 08:58:34 -0400 (0:00:01.185) 0:00:33.435 ****** 18911 1727096314.32136: entering _queue_task() for managed_node1/include_tasks 18911 1727096314.33060: worker is 1 (out of 1 available) 18911 1727096314.33077: exiting _queue_task() for managed_node1/include_tasks 18911 1727096314.33088: done queuing things up, now waiting for results queue to drain 18911 1727096314.33089: waiting for pending results... 
18911 1727096314.33500: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 18911 1727096314.33903: in run() - task 0afff68d-5257-09a7-aae1-00000000005c 18911 1727096314.33907: variable 'ansible_search_path' from source: unknown 18911 1727096314.33910: variable 'ansible_search_path' from source: unknown 18911 1727096314.33913: calling self._execute() 18911 1727096314.34158: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096314.34228: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096314.34232: variable 'omit' from source: magic vars 18911 1727096314.35093: variable 'ansible_distribution_major_version' from source: facts 18911 1727096314.35096: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096314.35099: _execute() done 18911 1727096314.35101: dumping result to json 18911 1727096314.35103: done dumping result, returning 18911 1727096314.35105: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0afff68d-5257-09a7-aae1-00000000005c] 18911 1727096314.35107: sending task result for task 0afff68d-5257-09a7-aae1-00000000005c 18911 1727096314.35241: no more pending results, returning what we have 18911 1727096314.35247: in VariableManager get_vars() 18911 1727096314.35297: Calling all_inventory to load vars for managed_node1 18911 1727096314.35300: Calling groups_inventory to load vars for managed_node1 18911 1727096314.35304: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096314.35317: Calling all_plugins_play to load vars for managed_node1 18911 1727096314.35320: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096314.35323: Calling groups_plugins_play to load vars for managed_node1 18911 1727096314.36174: done sending task result for task 0afff68d-5257-09a7-aae1-00000000005c 18911 
1727096314.36179: WORKER PROCESS EXITING 18911 1727096314.38430: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096314.41917: done with get_vars() 18911 1727096314.41945: variable 'ansible_search_path' from source: unknown 18911 1727096314.42062: variable 'ansible_search_path' from source: unknown 18911 1727096314.42100: we have included files to process 18911 1727096314.42102: generating all_blocks data 18911 1727096314.42103: done generating all_blocks data 18911 1727096314.42104: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 18911 1727096314.42105: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 18911 1727096314.42108: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 18911 1727096314.43419: done processing included file 18911 1727096314.43421: iterating over new_blocks loaded from include file 18911 1727096314.43422: in VariableManager get_vars() 18911 1727096314.43446: done with get_vars() 18911 1727096314.43448: filtering new block on tags 18911 1727096314.43467: done filtering new block on tags 18911 1727096314.43471: in VariableManager get_vars() 18911 1727096314.43593: done with get_vars() 18911 1727096314.43595: filtering new block on tags 18911 1727096314.43614: done filtering new block on tags 18911 1727096314.43617: in VariableManager get_vars() 18911 1727096314.43636: done with get_vars() 18911 1727096314.43637: filtering new block on tags 18911 1727096314.43653: done filtering new block on tags 18911 1727096314.43655: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 18911 1727096314.43660: extending task lists for all hosts 
with included blocks 18911 1727096314.44597: done extending task lists 18911 1727096314.44598: done processing included files 18911 1727096314.44599: results queue empty 18911 1727096314.44600: checking for any_errors_fatal 18911 1727096314.44601: done checking for any_errors_fatal 18911 1727096314.44602: checking for max_fail_percentage 18911 1727096314.44603: done checking for max_fail_percentage 18911 1727096314.44604: checking to see if all hosts have failed and the running result is not ok 18911 1727096314.44605: done checking to see if all hosts have failed 18911 1727096314.44605: getting the remaining hosts for this loop 18911 1727096314.44606: done getting the remaining hosts for this loop 18911 1727096314.44609: getting the next task for host managed_node1 18911 1727096314.44613: done getting next task for host managed_node1 18911 1727096314.44615: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 18911 1727096314.44618: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096314.44628: getting variables 18911 1727096314.44629: in VariableManager get_vars() 18911 1727096314.44646: Calling all_inventory to load vars for managed_node1 18911 1727096314.44648: Calling groups_inventory to load vars for managed_node1 18911 1727096314.44650: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096314.44656: Calling all_plugins_play to load vars for managed_node1 18911 1727096314.44658: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096314.44661: Calling groups_plugins_play to load vars for managed_node1 18911 1727096314.47478: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096314.50855: done with get_vars() 18911 1727096314.51027: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Monday 23 September 2024 08:58:34 -0400 (0:00:00.189) 0:00:33.625 ****** 18911 1727096314.51171: entering _queue_task() for managed_node1/setup 18911 1727096314.51834: worker is 1 (out of 1 available) 18911 1727096314.51847: exiting _queue_task() for managed_node1/setup 18911 1727096314.51859: done queuing things up, now waiting for results queue to drain 18911 1727096314.51861: waiting for pending results... 
18911 1727096314.52784: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 18911 1727096314.52869: in run() - task 0afff68d-5257-09a7-aae1-000000000458 18911 1727096314.53236: variable 'ansible_search_path' from source: unknown 18911 1727096314.53240: variable 'ansible_search_path' from source: unknown 18911 1727096314.53243: calling self._execute() 18911 1727096314.53386: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096314.53571: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096314.53774: variable 'omit' from source: magic vars 18911 1727096314.54660: variable 'ansible_distribution_major_version' from source: facts 18911 1727096314.54948: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096314.55592: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18911 1727096314.61503: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18911 1727096314.61566: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18911 1727096314.61652: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18911 1727096314.61756: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18911 1727096314.61937: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18911 1727096314.62056: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096314.62093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096314.62124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096314.62200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096314.62283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096314.62472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096314.62477: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096314.62507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096314.62549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096314.62813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096314.62952: variable '__network_required_facts' from source: role 
'' defaults 18911 1727096314.63045: variable 'ansible_facts' from source: unknown 18911 1727096314.64775: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 18911 1727096314.64780: when evaluation is False, skipping this task 18911 1727096314.64782: _execute() done 18911 1727096314.64784: dumping result to json 18911 1727096314.64786: done dumping result, returning 18911 1727096314.64789: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0afff68d-5257-09a7-aae1-000000000458] 18911 1727096314.64791: sending task result for task 0afff68d-5257-09a7-aae1-000000000458 18911 1727096314.64859: done sending task result for task 0afff68d-5257-09a7-aae1-000000000458 18911 1727096314.64861: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18911 1727096314.64915: no more pending results, returning what we have 18911 1727096314.64919: results queue empty 18911 1727096314.64921: checking for any_errors_fatal 18911 1727096314.64922: done checking for any_errors_fatal 18911 1727096314.64923: checking for max_fail_percentage 18911 1727096314.64925: done checking for max_fail_percentage 18911 1727096314.64926: checking to see if all hosts have failed and the running result is not ok 18911 1727096314.64927: done checking to see if all hosts have failed 18911 1727096314.64927: getting the remaining hosts for this loop 18911 1727096314.64929: done getting the remaining hosts for this loop 18911 1727096314.64933: getting the next task for host managed_node1 18911 1727096314.64942: done getting next task for host managed_node1 18911 1727096314.64946: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 18911 1727096314.64949: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18911 1727096314.64963: getting variables 18911 1727096314.64965: in VariableManager get_vars() 18911 1727096314.65008: Calling all_inventory to load vars for managed_node1 18911 1727096314.65011: Calling groups_inventory to load vars for managed_node1 18911 1727096314.65014: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096314.65026: Calling all_plugins_play to load vars for managed_node1 18911 1727096314.65029: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096314.65032: Calling groups_plugins_play to load vars for managed_node1 18911 1727096314.68696: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096314.71774: done with get_vars() 18911 1727096314.71800: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Monday 23 September 2024 08:58:34 -0400 (0:00:00.207) 0:00:33.833 ****** 18911 1727096314.71910: entering _queue_task() for managed_node1/stat 18911 1727096314.72380: worker is 1 (out of 1 available) 18911 1727096314.72479: exiting _queue_task() for managed_node1/stat 18911 1727096314.72490: done queuing things up, now waiting for results queue to drain 18911 1727096314.72492: waiting for pending results... 
18911 1727096314.72980: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 18911 1727096314.73375: in run() - task 0afff68d-5257-09a7-aae1-00000000045a 18911 1727096314.73384: variable 'ansible_search_path' from source: unknown 18911 1727096314.73387: variable 'ansible_search_path' from source: unknown 18911 1727096314.73390: calling self._execute() 18911 1727096314.73673: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096314.73677: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096314.73680: variable 'omit' from source: magic vars 18911 1727096314.74461: variable 'ansible_distribution_major_version' from source: facts 18911 1727096314.74488: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096314.74678: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18911 1727096314.74983: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18911 1727096314.75029: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18911 1727096314.75072: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18911 1727096314.75111: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18911 1727096314.75205: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18911 1727096314.75281: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18911 1727096314.75286: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096314.75288: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18911 1727096314.75390: variable '__network_is_ostree' from source: set_fact 18911 1727096314.75397: Evaluated conditional (not __network_is_ostree is defined): False 18911 1727096314.75400: when evaluation is False, skipping this task 18911 1727096314.75403: _execute() done 18911 1727096314.75407: dumping result to json 18911 1727096314.75410: done dumping result, returning 18911 1727096314.75418: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [0afff68d-5257-09a7-aae1-00000000045a] 18911 1727096314.75423: sending task result for task 0afff68d-5257-09a7-aae1-00000000045a 18911 1727096314.75672: done sending task result for task 0afff68d-5257-09a7-aae1-00000000045a 18911 1727096314.75675: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 18911 1727096314.75717: no more pending results, returning what we have 18911 1727096314.75720: results queue empty 18911 1727096314.75721: checking for any_errors_fatal 18911 1727096314.75726: done checking for any_errors_fatal 18911 1727096314.75726: checking for max_fail_percentage 18911 1727096314.75728: done checking for max_fail_percentage 18911 1727096314.75729: checking to see if all hosts have failed and the running result is not ok 18911 1727096314.75730: done checking to see if all hosts have failed 18911 1727096314.75730: getting the remaining hosts for this loop 18911 1727096314.75731: done getting the remaining hosts for this loop 18911 
1727096314.75734: getting the next task for host managed_node1 18911 1727096314.75739: done getting next task for host managed_node1 18911 1727096314.75742: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 18911 1727096314.75745: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18911 1727096314.75757: getting variables 18911 1727096314.75758: in VariableManager get_vars() 18911 1727096314.75797: Calling all_inventory to load vars for managed_node1 18911 1727096314.75800: Calling groups_inventory to load vars for managed_node1 18911 1727096314.75802: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096314.75811: Calling all_plugins_play to load vars for managed_node1 18911 1727096314.75813: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096314.75816: Calling groups_plugins_play to load vars for managed_node1 18911 1727096314.78776: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096314.82208: done with get_vars() 18911 1727096314.82240: done getting variables 18911 1727096314.82423: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Monday 23 September 2024 08:58:34 -0400 (0:00:00.105) 0:00:33.938 ****** 18911 1727096314.82458: entering _queue_task() for managed_node1/set_fact 18911 1727096314.83213: worker is 1 (out of 1 available) 18911 1727096314.83449: exiting _queue_task() for managed_node1/set_fact 18911 1727096314.83461: done queuing things up, now waiting for results queue to drain 18911 1727096314.83463: waiting for pending results... 18911 1727096314.83848: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 18911 1727096314.84320: in run() - task 0afff68d-5257-09a7-aae1-00000000045b 18911 1727096314.84324: variable 'ansible_search_path' from source: unknown 18911 1727096314.84328: variable 'ansible_search_path' from source: unknown 18911 1727096314.84331: calling self._execute() 18911 1727096314.84410: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096314.84536: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096314.84540: variable 'omit' from source: magic vars 18911 1727096314.85392: variable 'ansible_distribution_major_version' from source: facts 18911 1727096314.85419: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096314.85783: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18911 1727096314.86116: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18911 1727096314.86179: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18911 1727096314.86222: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18911 
1727096314.86264: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18911 1727096314.86365: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18911 1727096314.86414: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18911 1727096314.86447: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096314.86479: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18911 1727096314.86582: variable '__network_is_ostree' from source: set_fact 18911 1727096314.86605: Evaluated conditional (not __network_is_ostree is defined): False 18911 1727096314.86614: when evaluation is False, skipping this task 18911 1727096314.86621: _execute() done 18911 1727096314.86628: dumping result to json 18911 1727096314.86672: done dumping result, returning 18911 1727096314.86676: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0afff68d-5257-09a7-aae1-00000000045b] 18911 1727096314.86678: sending task result for task 0afff68d-5257-09a7-aae1-00000000045b 18911 1727096314.86909: done sending task result for task 0afff68d-5257-09a7-aae1-00000000045b 18911 1727096314.86912: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 18911 1727096314.86965: no more pending results, returning what we 
have 18911 1727096314.86981: results queue empty 18911 1727096314.86982: checking for any_errors_fatal 18911 1727096314.86991: done checking for any_errors_fatal 18911 1727096314.86991: checking for max_fail_percentage 18911 1727096314.86993: done checking for max_fail_percentage 18911 1727096314.86994: checking to see if all hosts have failed and the running result is not ok 18911 1727096314.86995: done checking to see if all hosts have failed 18911 1727096314.86996: getting the remaining hosts for this loop 18911 1727096314.86998: done getting the remaining hosts for this loop 18911 1727096314.87002: getting the next task for host managed_node1 18911 1727096314.87012: done getting next task for host managed_node1 18911 1727096314.87017: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 18911 1727096314.87026: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096314.87044: getting variables 18911 1727096314.87046: in VariableManager get_vars() 18911 1727096314.87090: Calling all_inventory to load vars for managed_node1 18911 1727096314.87092: Calling groups_inventory to load vars for managed_node1 18911 1727096314.87095: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096314.87106: Calling all_plugins_play to load vars for managed_node1 18911 1727096314.87109: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096314.87111: Calling groups_plugins_play to load vars for managed_node1 18911 1727096314.89934: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096314.92083: done with get_vars() 18911 1727096314.92174: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Monday 23 September 2024 08:58:34 -0400 (0:00:00.098) 0:00:34.037 ****** 18911 1727096314.92341: entering _queue_task() for managed_node1/service_facts 18911 1727096314.93157: worker is 1 (out of 1 available) 18911 1727096314.93173: exiting _queue_task() for managed_node1/service_facts 18911 1727096314.93185: done queuing things up, now waiting for results queue to drain 18911 1727096314.93187: waiting for pending results... 
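
[Editor's note] The "Check which services are running" task queued above runs the `service_facts` module, which returns a dictionary under `ansible_facts.services` keyed by unit name — the exact shape is visible verbatim in the module's stdout chunk later in this log. A minimal sketch of consuming that structure outside Ansible; the sample entries are copied from the log output, and the helper function name is hypothetical:

```python
# Minimal sketch: filtering the ansible_facts.services mapping that
# service_facts returns. The dict shape matches the module stdout in
# this log; running_services() is an illustrative helper, not part of
# Ansible itself.
from typing import Dict, List


def running_services(facts: Dict[str, dict]) -> List[str]:
    """Return the names of services reported as running, sorted."""
    return sorted(
        name for name, svc in facts.items() if svc.get("state") == "running"
    )


# Sample entries copied from the service_facts output below.
services = {
    "auditd.service": {"name": "auditd.service", "state": "running",
                       "status": "enabled", "source": "systemd"},
    "autofs.service": {"name": "autofs.service", "state": "stopped",
                       "status": "not-found", "source": "systemd"},
    "NetworkManager.service": {"name": "NetworkManager.service",
                               "state": "running", "status": "enabled",
                               "source": "systemd"},
}

print(running_services(services))  # → ['NetworkManager.service', 'auditd.service']
```

The network role gathers these facts so later tasks can branch on which service manager (e.g. NetworkManager) is actually active on the managed node.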
18911 1727096314.93786: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 18911 1727096314.94020: in run() - task 0afff68d-5257-09a7-aae1-00000000045d 18911 1727096314.94024: variable 'ansible_search_path' from source: unknown 18911 1727096314.94028: variable 'ansible_search_path' from source: unknown 18911 1727096314.94105: calling self._execute() 18911 1727096314.94347: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096314.94351: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096314.94570: variable 'omit' from source: magic vars 18911 1727096314.95161: variable 'ansible_distribution_major_version' from source: facts 18911 1727096314.95182: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096314.95228: variable 'omit' from source: magic vars 18911 1727096314.95289: variable 'omit' from source: magic vars 18911 1727096314.95447: variable 'omit' from source: magic vars 18911 1727096314.95570: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18911 1727096314.95613: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18911 1727096314.95679: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18911 1727096314.95765: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096314.95783: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096314.95888: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18911 1727096314.95898: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096314.95907: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 18911 1727096314.96144: Set connection var ansible_shell_executable to /bin/sh 18911 1727096314.96208: Set connection var ansible_timeout to 10 18911 1727096314.96216: Set connection var ansible_shell_type to sh 18911 1727096314.96227: Set connection var ansible_module_compression to ZIP_DEFLATED 18911 1727096314.96236: Set connection var ansible_pipelining to False 18911 1727096314.96247: Set connection var ansible_connection to ssh 18911 1727096314.96526: variable 'ansible_shell_executable' from source: unknown 18911 1727096314.96530: variable 'ansible_connection' from source: unknown 18911 1727096314.96533: variable 'ansible_module_compression' from source: unknown 18911 1727096314.96535: variable 'ansible_shell_type' from source: unknown 18911 1727096314.96537: variable 'ansible_shell_executable' from source: unknown 18911 1727096314.96538: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096314.96540: variable 'ansible_pipelining' from source: unknown 18911 1727096314.96542: variable 'ansible_timeout' from source: unknown 18911 1727096314.96544: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096314.96811: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18911 1727096314.96864: variable 'omit' from source: magic vars 18911 1727096314.96876: starting attempt loop 18911 1727096314.96884: running the handler 18911 1727096314.96977: _low_level_execute_command(): starting 18911 1727096314.96992: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18911 1727096314.98488: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096314.98703: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096314.98770: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096315.00499: stdout chunk (state=3): >>>/root <<< 18911 1727096315.00628: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096315.00691: stderr chunk (state=3): >>><<< 18911 1727096315.00743: stdout chunk (state=3): >>><<< 18911 1727096315.00951: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096315.00955: _low_level_execute_command(): starting 18911 1727096315.00958: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096315.0077033-20532-54635901034366 `" && echo ansible-tmp-1727096315.0077033-20532-54635901034366="` echo /root/.ansible/tmp/ansible-tmp-1727096315.0077033-20532-54635901034366 `" ) && sleep 0' 18911 1727096315.02125: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096315.02146: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096315.02276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096315.02392: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096315.02441: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096315.02471: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096315.02622: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096315.04700: stdout chunk (state=3): >>>ansible-tmp-1727096315.0077033-20532-54635901034366=/root/.ansible/tmp/ansible-tmp-1727096315.0077033-20532-54635901034366 <<< 18911 1727096315.04911: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096315.04914: stderr chunk (state=3): >>><<< 18911 1727096315.04978: stdout chunk (state=3): >>><<< 18911 1727096315.05187: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096315.0077033-20532-54635901034366=/root/.ansible/tmp/ansible-tmp-1727096315.0077033-20532-54635901034366 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096315.05191: variable 'ansible_module_compression' from source: unknown 18911 1727096315.05206: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18911d7od04qi/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 18911 1727096315.05324: variable 'ansible_facts' from source: unknown 18911 1727096315.05632: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096315.0077033-20532-54635901034366/AnsiballZ_service_facts.py 18911 1727096315.05999: Sending initial data 18911 1727096315.06005: Sent initial data (161 bytes) 18911 1727096315.07197: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 
1727096315.07229: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096315.07397: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096315.09192: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18911 1727096315.09276: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18911 1727096315.09308: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18911d7od04qi/tmpj6p1p9oa /root/.ansible/tmp/ansible-tmp-1727096315.0077033-20532-54635901034366/AnsiballZ_service_facts.py <<< 18911 1727096315.09312: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096315.0077033-20532-54635901034366/AnsiballZ_service_facts.py" <<< 18911 1727096315.09435: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18911d7od04qi/tmpj6p1p9oa" to remote "/root/.ansible/tmp/ansible-tmp-1727096315.0077033-20532-54635901034366/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096315.0077033-20532-54635901034366/AnsiballZ_service_facts.py" <<< 18911 1727096315.11485: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096315.11488: stdout chunk (state=3): >>><<< 18911 1727096315.11675: stderr chunk (state=3): >>><<< 18911 1727096315.11689: done transferring module to remote 18911 1727096315.11794: _low_level_execute_command(): starting 18911 1727096315.11804: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096315.0077033-20532-54635901034366/ /root/.ansible/tmp/ansible-tmp-1727096315.0077033-20532-54635901034366/AnsiballZ_service_facts.py && sleep 0' 18911 1727096315.12923: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096315.13048: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096315.13151: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096315.13263: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096315.13370: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096315.15544: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096315.15548: stdout chunk (state=3): >>><<< 18911 1727096315.15553: stderr chunk (state=3): >>><<< 18911 1727096315.15575: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096315.15583: _low_level_execute_command(): starting 18911 1727096315.15585: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096315.0077033-20532-54635901034366/AnsiballZ_service_facts.py && sleep 0' 18911 1727096315.16869: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096315.16879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 18911 1727096315.16885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096315.16904: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096315.16909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096315.17022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found <<< 18911 1727096315.17026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096315.17028: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/8e98a30b23' <<< 18911 1727096315.17082: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096315.17166: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096316.75537: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": 
"systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source":<<< 18911 1727096316.75557: stdout chunk (state=3): >>> "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": 
"systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.<<< 18911 1727096316.75616: stdout chunk (state=3): >>>service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": 
{"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": 
"blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": 
"unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": 
"sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": 
"systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": 
"systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "<<< 18911 1727096316.75632: stdout chunk (state=3): >>>systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 18911 1727096316.77097: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. <<< 18911 1727096316.77126: stderr chunk (state=3): >>><<< 18911 1727096316.77130: stdout chunk (state=3): >>><<< 18911 1727096316.77157: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", 
"state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": 
"active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", 
"state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": 
{"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": 
"systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": 
"systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, 
"wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, 
"dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": 
{"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": 
"selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": 
{"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", 
"status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 
debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 18911 1727096316.77601: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096315.0077033-20532-54635901034366/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18911 1727096316.77610: _low_level_execute_command(): starting 18911 1727096316.77615: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096315.0077033-20532-54635901034366/ > /dev/null 2>&1 && sleep 0' 18911 1727096316.78167: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096316.78172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096316.78175: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration <<< 18911 1727096316.78177: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096316.78271: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096316.78274: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096316.78350: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096316.80206: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096316.80226: stderr chunk (state=3): >>><<< 18911 1727096316.80229: stdout chunk (state=3): >>><<< 18911 1727096316.80242: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 18911 1727096316.80248: handler run complete 18911 1727096316.80372: variable 'ansible_facts' from source: unknown 18911 1727096316.80466: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096316.81079: variable 'ansible_facts' from source: unknown 18911 1727096316.81082: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096316.81272: attempt loop complete, returning result 18911 1727096316.81279: _execute() done 18911 1727096316.81282: dumping result to json 18911 1727096316.81283: done dumping result, returning 18911 1727096316.81287: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [0afff68d-5257-09a7-aae1-00000000045d] 18911 1727096316.81290: sending task result for task 0afff68d-5257-09a7-aae1-00000000045d ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18911 1727096316.82373: no more pending results, returning what we have 18911 1727096316.82376: results queue empty 18911 1727096316.82377: checking for any_errors_fatal 18911 1727096316.82381: done checking for any_errors_fatal 18911 1727096316.82382: checking for max_fail_percentage 18911 1727096316.82383: done checking for max_fail_percentage 18911 1727096316.82384: checking to see if all hosts have failed and the running result is not ok 18911 1727096316.82385: done checking to see if all hosts have failed 18911 1727096316.82385: getting the remaining hosts for this loop 18911 1727096316.82387: done getting the remaining hosts for this loop 18911 1727096316.82389: getting the next task for host managed_node1 18911 1727096316.82395: done getting next task for host managed_node1 18911 1727096316.82399: ^ task is: TASK: 
fedora.linux_system_roles.network : Check which packages are installed 18911 1727096316.82401: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18911 1727096316.82409: getting variables 18911 1727096316.82411: in VariableManager get_vars() 18911 1727096316.82440: Calling all_inventory to load vars for managed_node1 18911 1727096316.82443: Calling groups_inventory to load vars for managed_node1 18911 1727096316.82445: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096316.82453: Calling all_plugins_play to load vars for managed_node1 18911 1727096316.82456: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096316.82459: Calling groups_plugins_play to load vars for managed_node1 18911 1727096316.82997: done sending task result for task 0afff68d-5257-09a7-aae1-00000000045d 18911 1727096316.83001: WORKER PROCESS EXITING 18911 1727096316.84473: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096316.86816: done with get_vars() 18911 1727096316.86878: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Monday 23 September 2024 08:58:36 -0400 (0:00:01.947) 0:00:35.985 ****** 18911 1727096316.87103: entering _queue_task() for managed_node1/package_facts 18911 1727096316.87787: 
worker is 1 (out of 1 available) 18911 1727096316.87801: exiting _queue_task() for managed_node1/package_facts 18911 1727096316.87819: done queuing things up, now waiting for results queue to drain 18911 1727096316.87821: waiting for pending results... 18911 1727096316.88368: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 18911 1727096316.88767: in run() - task 0afff68d-5257-09a7-aae1-00000000045e 18911 1727096316.88773: variable 'ansible_search_path' from source: unknown 18911 1727096316.88777: variable 'ansible_search_path' from source: unknown 18911 1727096316.88818: calling self._execute() 18911 1727096316.89010: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096316.89030: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096316.89071: variable 'omit' from source: magic vars 18911 1727096316.89502: variable 'ansible_distribution_major_version' from source: facts 18911 1727096316.89515: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096316.89576: variable 'omit' from source: magic vars 18911 1727096316.89599: variable 'omit' from source: magic vars 18911 1727096316.89647: variable 'omit' from source: magic vars 18911 1727096316.89720: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18911 1727096316.89751: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18911 1727096316.89829: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18911 1727096316.89833: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096316.89835: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 
1727096316.89871: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18911 1727096316.89882: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096316.89891: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096316.90050: Set connection var ansible_shell_executable to /bin/sh 18911 1727096316.90066: Set connection var ansible_timeout to 10 18911 1727096316.90118: Set connection var ansible_shell_type to sh 18911 1727096316.90121: Set connection var ansible_module_compression to ZIP_DEFLATED 18911 1727096316.90123: Set connection var ansible_pipelining to False 18911 1727096316.90125: Set connection var ansible_connection to ssh 18911 1727096316.90130: variable 'ansible_shell_executable' from source: unknown 18911 1727096316.90139: variable 'ansible_connection' from source: unknown 18911 1727096316.90152: variable 'ansible_module_compression' from source: unknown 18911 1727096316.90165: variable 'ansible_shell_type' from source: unknown 18911 1727096316.90225: variable 'ansible_shell_executable' from source: unknown 18911 1727096316.90228: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096316.90231: variable 'ansible_pipelining' from source: unknown 18911 1727096316.90233: variable 'ansible_timeout' from source: unknown 18911 1727096316.90234: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096316.90440: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18911 1727096316.90649: variable 'omit' from source: magic vars 18911 1727096316.90652: starting attempt loop 18911 1727096316.90654: running the handler 18911 1727096316.90656: _low_level_execute_command(): starting 18911 1727096316.90658: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18911 1727096316.91566: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096316.91581: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096316.91682: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096316.93387: stdout chunk (state=3): >>>/root <<< 18911 1727096316.93579: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096316.93582: stdout chunk (state=3): >>><<< 18911 1727096316.93584: stderr chunk (state=3): >>><<< 18911 1727096316.93588: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 
originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096316.93600: _low_level_execute_command(): starting 18911 1727096316.93608: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096316.935844-20616-46131813664705 `" && echo ansible-tmp-1727096316.935844-20616-46131813664705="` echo /root/.ansible/tmp/ansible-tmp-1727096316.935844-20616-46131813664705 `" ) && sleep 0' 18911 1727096316.94456: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096316.94566: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096316.94600: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096316.94698: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096316.96700: stdout chunk (state=3): >>>ansible-tmp-1727096316.935844-20616-46131813664705=/root/.ansible/tmp/ansible-tmp-1727096316.935844-20616-46131813664705 <<< 18911 1727096316.96832: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096316.96835: stdout chunk (state=3): >>><<< 18911 1727096316.96838: stderr chunk (state=3): >>><<< 18911 1727096316.97073: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096316.935844-20616-46131813664705=/root/.ansible/tmp/ansible-tmp-1727096316.935844-20616-46131813664705 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096316.97077: variable 'ansible_module_compression' from source: unknown 18911 1727096316.97079: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18911d7od04qi/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 18911 1727096316.97082: variable 'ansible_facts' from source: unknown 18911 1727096316.97363: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096316.935844-20616-46131813664705/AnsiballZ_package_facts.py 18911 1727096316.97683: Sending initial data 18911 1727096316.97710: Sent initial data (160 bytes) 18911 1727096316.98679: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096316.98752: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096317.00435: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18911 1727096317.00492: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18911 1727096317.00567: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18911d7od04qi/tmplxwd793_ /root/.ansible/tmp/ansible-tmp-1727096316.935844-20616-46131813664705/AnsiballZ_package_facts.py <<< 18911 1727096317.00572: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096316.935844-20616-46131813664705/AnsiballZ_package_facts.py" <<< 18911 1727096317.00650: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18911d7od04qi/tmplxwd793_" to remote "/root/.ansible/tmp/ansible-tmp-1727096316.935844-20616-46131813664705/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096316.935844-20616-46131813664705/AnsiballZ_package_facts.py" <<< 18911 1727096317.02550: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096317.02640: stderr chunk (state=3): >>><<< 18911 1727096317.02652: stdout chunk (state=3): >>><<< 18911 1727096317.02727: done transferring module to remote 18911 1727096317.02731: _low_level_execute_command(): starting 18911 1727096317.02733: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096316.935844-20616-46131813664705/ /root/.ansible/tmp/ansible-tmp-1727096316.935844-20616-46131813664705/AnsiballZ_package_facts.py && sleep 0' 18911 1727096317.03823: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 18911 1727096317.03829: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096317.03956: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096317.03993: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096317.04044: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096317.04090: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096317.06176: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096317.06180: stdout chunk (state=3): >>><<< 18911 1727096317.06182: stderr chunk (state=3): >>><<< 18911 1727096317.06184: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 
10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096317.06192: _low_level_execute_command(): starting 18911 1727096317.06195: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096316.935844-20616-46131813664705/AnsiballZ_package_facts.py && sleep 0' 18911 1727096317.06926: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096317.06944: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096317.07023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096317.07087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 18911 1727096317.07166: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096317.07177: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 
1727096317.07205: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096317.07324: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096317.51622: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": 
"default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": 
"6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, 
"arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 18911 1727096317.51656: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", 
"release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 18911 1727096317.51661: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", 
"release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null,<<< 18911 1727096317.51682: stdout chunk (state=3): >>> "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", 
"release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": 
"libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": 
"1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "sou<<< 18911 1727096317.51696: stdout chunk (state=3): >>>rce": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arc<<< 18911 1727096317.51747: stdout chunk (state=3): >>>h": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", 
"release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", 
"version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", 
"version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": 
[{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": 
[{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": 
"13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "r<<< 18911 1727096317.51759: stdout chunk (state=3): >>>pm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], 
"perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": 
"perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", 
"epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": 
"perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], 
"perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": 
"lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 18911 1727096317.53598: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
<<< 18911 1727096317.53602: stderr chunk (state=3): >>><<< 18911 1727096317.53604: stdout chunk (state=3): >>><<< 18911 1727096317.53651: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
18911 1727096317.54935: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096316.935844-20616-46131813664705/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18911 1727096317.54950: _low_level_execute_command(): starting 18911 1727096317.54955: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096316.935844-20616-46131813664705/ > /dev/null 2>&1 && sleep 0' 18911 1727096317.55622: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096317.55626: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096317.55628: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096317.55638: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096317.55729: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096317.57623: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096317.57680: stderr chunk (state=3): >>><<< 18911 1727096317.57684: stdout chunk (state=3): >>><<< 18911 1727096317.57728: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096317.57731: handler run complete 18911 1727096317.65061: variable 'ansible_facts' from source: unknown 18911 1727096317.65308: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096317.66382: variable 'ansible_facts' from source: unknown 18911 1727096317.66624: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096317.67014: attempt loop complete, returning result 18911 1727096317.67023: _execute() done 18911 1727096317.67026: dumping result to json 18911 1727096317.67144: done dumping result, returning 18911 1727096317.67151: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [0afff68d-5257-09a7-aae1-00000000045e] 18911 1727096317.67154: sending task result for task 0afff68d-5257-09a7-aae1-00000000045e 18911 1727096317.73448: done sending task result for task 0afff68d-5257-09a7-aae1-00000000045e ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18911 1727096317.73498: WORKER PROCESS EXITING 18911 1727096317.73506: no more pending results, returning what we have 18911 1727096317.73508: results queue empty 18911 1727096317.73508: checking for any_errors_fatal 18911 1727096317.73511: done checking for any_errors_fatal 18911 1727096317.73511: checking for max_fail_percentage 18911 1727096317.73512: done checking for max_fail_percentage 18911 1727096317.73512: checking to see if all hosts have failed and the running result is not ok 18911 1727096317.73513: done checking to see if all hosts have failed 18911 1727096317.73513: getting the remaining hosts for this loop 18911 1727096317.73514: done getting the remaining hosts for this loop 18911 1727096317.73516: getting the next task for host managed_node1 18911 1727096317.73519: done getting next task for host managed_node1 18911 1727096317.73521: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 18911 1727096317.73522: 
^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18911 1727096317.73528: getting variables 18911 1727096317.73528: in VariableManager get_vars() 18911 1727096317.73543: Calling all_inventory to load vars for managed_node1 18911 1727096317.73545: Calling groups_inventory to load vars for managed_node1 18911 1727096317.73546: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096317.73551: Calling all_plugins_play to load vars for managed_node1 18911 1727096317.73552: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096317.73554: Calling groups_plugins_play to load vars for managed_node1 18911 1727096317.74250: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096317.75383: done with get_vars() 18911 1727096317.75409: done getting variables 18911 1727096317.75461: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Monday 23 September 2024 08:58:37 -0400 (0:00:00.883) 0:00:36.869 ****** 18911 1727096317.75487: entering _queue_task() for managed_node1/debug 18911 1727096317.75785: worker is 1 (out of 1 available) 18911 1727096317.75797: exiting _queue_task() for managed_node1/debug 18911 1727096317.75808: done queuing things up, now waiting for results queue to drain 18911 
1727096317.75810: waiting for pending results... 18911 1727096317.76054: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 18911 1727096317.76143: in run() - task 0afff68d-5257-09a7-aae1-00000000005d 18911 1727096317.76153: variable 'ansible_search_path' from source: unknown 18911 1727096317.76156: variable 'ansible_search_path' from source: unknown 18911 1727096317.76190: calling self._execute() 18911 1727096317.76266: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096317.76275: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096317.76286: variable 'omit' from source: magic vars 18911 1727096317.76593: variable 'ansible_distribution_major_version' from source: facts 18911 1727096317.76603: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096317.76609: variable 'omit' from source: magic vars 18911 1727096317.76636: variable 'omit' from source: magic vars 18911 1727096317.76712: variable 'network_provider' from source: set_fact 18911 1727096317.76727: variable 'omit' from source: magic vars 18911 1727096317.76762: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18911 1727096317.76799: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18911 1727096317.76815: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18911 1727096317.76829: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096317.76837: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096317.76861: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18911 1727096317.76864: variable 'ansible_host' from 
source: host vars for 'managed_node1' 18911 1727096317.76872: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096317.76945: Set connection var ansible_shell_executable to /bin/sh 18911 1727096317.76949: Set connection var ansible_timeout to 10 18911 1727096317.76952: Set connection var ansible_shell_type to sh 18911 1727096317.76958: Set connection var ansible_module_compression to ZIP_DEFLATED 18911 1727096317.76963: Set connection var ansible_pipelining to False 18911 1727096317.76972: Set connection var ansible_connection to ssh 18911 1727096317.76995: variable 'ansible_shell_executable' from source: unknown 18911 1727096317.76998: variable 'ansible_connection' from source: unknown 18911 1727096317.77001: variable 'ansible_module_compression' from source: unknown 18911 1727096317.77003: variable 'ansible_shell_type' from source: unknown 18911 1727096317.77006: variable 'ansible_shell_executable' from source: unknown 18911 1727096317.77009: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096317.77012: variable 'ansible_pipelining' from source: unknown 18911 1727096317.77014: variable 'ansible_timeout' from source: unknown 18911 1727096317.77017: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096317.77118: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18911 1727096317.77134: variable 'omit' from source: magic vars 18911 1727096317.77137: starting attempt loop 18911 1727096317.77139: running the handler 18911 1727096317.77177: handler run complete 18911 1727096317.77188: attempt loop complete, returning result 18911 1727096317.77190: _execute() done 18911 1727096317.77193: dumping result to json 18911 
1727096317.77196: done dumping result, returning 18911 1727096317.77205: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0afff68d-5257-09a7-aae1-00000000005d] 18911 1727096317.77207: sending task result for task 0afff68d-5257-09a7-aae1-00000000005d 18911 1727096317.77298: done sending task result for task 0afff68d-5257-09a7-aae1-00000000005d 18911 1727096317.77301: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: Using network provider: nm 18911 1727096317.77360: no more pending results, returning what we have 18911 1727096317.77363: results queue empty 18911 1727096317.77363: checking for any_errors_fatal 18911 1727096317.77383: done checking for any_errors_fatal 18911 1727096317.77384: checking for max_fail_percentage 18911 1727096317.77386: done checking for max_fail_percentage 18911 1727096317.77387: checking to see if all hosts have failed and the running result is not ok 18911 1727096317.77388: done checking to see if all hosts have failed 18911 1727096317.77388: getting the remaining hosts for this loop 18911 1727096317.77390: done getting the remaining hosts for this loop 18911 1727096317.77394: getting the next task for host managed_node1 18911 1727096317.77400: done getting next task for host managed_node1 18911 1727096317.77405: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 18911 1727096317.77406: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096317.77418: getting variables 18911 1727096317.77419: in VariableManager get_vars() 18911 1727096317.77455: Calling all_inventory to load vars for managed_node1 18911 1727096317.77458: Calling groups_inventory to load vars for managed_node1 18911 1727096317.77460: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096317.77480: Calling all_plugins_play to load vars for managed_node1 18911 1727096317.77483: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096317.77486: Calling groups_plugins_play to load vars for managed_node1 18911 1727096317.78614: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096317.79737: done with get_vars() 18911 1727096317.79758: done getting variables 18911 1727096317.79821: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Monday 23 September 2024 08:58:37 -0400 (0:00:00.043) 0:00:36.912 ****** 18911 1727096317.79846: entering _queue_task() for managed_node1/fail 18911 1727096317.80174: worker is 1 (out of 1 available) 18911 1727096317.80188: exiting _queue_task() for managed_node1/fail 18911 1727096317.80199: done queuing things up, now waiting for results queue to drain 18911 1727096317.80208: waiting for pending results... 
18911 1727096317.80433: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 18911 1727096317.80531: in run() - task 0afff68d-5257-09a7-aae1-00000000005e 18911 1727096317.80537: variable 'ansible_search_path' from source: unknown 18911 1727096317.80541: variable 'ansible_search_path' from source: unknown 18911 1727096317.80597: calling self._execute() 18911 1727096317.80698: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096317.80702: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096317.80718: variable 'omit' from source: magic vars 18911 1727096317.81057: variable 'ansible_distribution_major_version' from source: facts 18911 1727096317.81069: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096317.81205: variable 'network_state' from source: role '' defaults 18911 1727096317.81208: Evaluated conditional (network_state != {}): False 18911 1727096317.81211: when evaluation is False, skipping this task 18911 1727096317.81215: _execute() done 18911 1727096317.81226: dumping result to json 18911 1727096317.81230: done dumping result, returning 18911 1727096317.81233: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0afff68d-5257-09a7-aae1-00000000005e] 18911 1727096317.81236: sending task result for task 0afff68d-5257-09a7-aae1-00000000005e 18911 1727096317.81387: done sending task result for task 0afff68d-5257-09a7-aae1-00000000005e 18911 1727096317.81390: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18911 1727096317.81451: no more pending results, 
returning what we have 18911 1727096317.81455: results queue empty 18911 1727096317.81456: checking for any_errors_fatal 18911 1727096317.81463: done checking for any_errors_fatal 18911 1727096317.81463: checking for max_fail_percentage 18911 1727096317.81465: done checking for max_fail_percentage 18911 1727096317.81465: checking to see if all hosts have failed and the running result is not ok 18911 1727096317.81466: done checking to see if all hosts have failed 18911 1727096317.81468: getting the remaining hosts for this loop 18911 1727096317.81470: done getting the remaining hosts for this loop 18911 1727096317.81473: getting the next task for host managed_node1 18911 1727096317.81479: done getting next task for host managed_node1 18911 1727096317.81483: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 18911 1727096317.81485: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096317.81498: getting variables 18911 1727096317.81500: in VariableManager get_vars() 18911 1727096317.81531: Calling all_inventory to load vars for managed_node1 18911 1727096317.81534: Calling groups_inventory to load vars for managed_node1 18911 1727096317.81536: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096317.81546: Calling all_plugins_play to load vars for managed_node1 18911 1727096317.81548: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096317.81550: Calling groups_plugins_play to load vars for managed_node1 18911 1727096317.82476: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096317.83902: done with get_vars() 18911 1727096317.83924: done getting variables 18911 1727096317.83986: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Monday 23 September 2024 08:58:37 -0400 (0:00:00.041) 0:00:36.954 ****** 18911 1727096317.84024: entering _queue_task() for managed_node1/fail 18911 1727096317.84440: worker is 1 (out of 1 available) 18911 1727096317.84454: exiting _queue_task() for managed_node1/fail 18911 1727096317.84465: done queuing things up, now waiting for results queue to drain 18911 1727096317.84467: waiting for pending results... 
18911 1727096317.84797: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 18911 1727096317.85066: in run() - task 0afff68d-5257-09a7-aae1-00000000005f 18911 1727096317.85075: variable 'ansible_search_path' from source: unknown 18911 1727096317.85078: variable 'ansible_search_path' from source: unknown 18911 1727096317.85082: calling self._execute() 18911 1727096317.85129: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096317.85136: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096317.85154: variable 'omit' from source: magic vars 18911 1727096317.85548: variable 'ansible_distribution_major_version' from source: facts 18911 1727096317.85562: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096317.85646: variable 'network_state' from source: role '' defaults 18911 1727096317.85656: Evaluated conditional (network_state != {}): False 18911 1727096317.85660: when evaluation is False, skipping this task 18911 1727096317.85663: _execute() done 18911 1727096317.85666: dumping result to json 18911 1727096317.85670: done dumping result, returning 18911 1727096317.85683: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0afff68d-5257-09a7-aae1-00000000005f] 18911 1727096317.85686: sending task result for task 0afff68d-5257-09a7-aae1-00000000005f 18911 1727096317.85771: done sending task result for task 0afff68d-5257-09a7-aae1-00000000005f 18911 1727096317.85774: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18911 1727096317.85830: no more pending results, returning what we have 18911 
1727096317.85833: results queue empty 18911 1727096317.85835: checking for any_errors_fatal 18911 1727096317.85842: done checking for any_errors_fatal 18911 1727096317.85843: checking for max_fail_percentage 18911 1727096317.85845: done checking for max_fail_percentage 18911 1727096317.85845: checking to see if all hosts have failed and the running result is not ok 18911 1727096317.85846: done checking to see if all hosts have failed 18911 1727096317.85846: getting the remaining hosts for this loop 18911 1727096317.85848: done getting the remaining hosts for this loop 18911 1727096317.85851: getting the next task for host managed_node1 18911 1727096317.85858: done getting next task for host managed_node1 18911 1727096317.85861: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 18911 1727096317.85863: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096317.85879: getting variables 18911 1727096317.85881: in VariableManager get_vars() 18911 1727096317.85922: Calling all_inventory to load vars for managed_node1 18911 1727096317.85926: Calling groups_inventory to load vars for managed_node1 18911 1727096317.85928: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096317.85938: Calling all_plugins_play to load vars for managed_node1 18911 1727096317.85940: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096317.85942: Calling groups_plugins_play to load vars for managed_node1 18911 1727096317.87581: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096317.88482: done with get_vars() 18911 1727096317.88503: done getting variables 18911 1727096317.88548: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Monday 23 September 2024 08:58:37 -0400 (0:00:00.045) 0:00:37.000 ****** 18911 1727096317.88574: entering _queue_task() for managed_node1/fail 18911 1727096317.88830: worker is 1 (out of 1 available) 18911 1727096317.88842: exiting _queue_task() for managed_node1/fail 18911 1727096317.88853: done queuing things up, now waiting for results queue to drain 18911 1727096317.88855: waiting for pending results... 
18911 1727096317.89044: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 18911 1727096317.89126: in run() - task 0afff68d-5257-09a7-aae1-000000000060 18911 1727096317.89136: variable 'ansible_search_path' from source: unknown 18911 1727096317.89140: variable 'ansible_search_path' from source: unknown 18911 1727096317.89173: calling self._execute() 18911 1727096317.89251: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096317.89256: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096317.89264: variable 'omit' from source: magic vars 18911 1727096317.89551: variable 'ansible_distribution_major_version' from source: facts 18911 1727096317.89561: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096317.89687: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18911 1727096317.91777: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18911 1727096317.91854: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18911 1727096317.91919: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18911 1727096317.91964: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18911 1727096317.92032: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18911 1727096317.92156: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096317.92651: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096317.92665: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096317.92722: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096317.92744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096317.92878: variable 'ansible_distribution_major_version' from source: facts 18911 1727096317.92909: Evaluated conditional (ansible_distribution_major_version | int > 9): True 18911 1727096317.93048: variable 'ansible_distribution' from source: facts 18911 1727096317.93090: variable '__network_rh_distros' from source: role '' defaults 18911 1727096317.93093: Evaluated conditional (ansible_distribution in __network_rh_distros): True 18911 1727096317.93449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096317.93488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096317.93546: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 
1727096317.93579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096317.93637: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096317.93695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096317.93747: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096317.93802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096317.93874: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096317.93903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096317.93994: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096317.94024: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 18911 1727096317.94054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096317.94194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096317.94197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096317.94692: variable 'network_connections' from source: play vars 18911 1727096317.94709: variable 'profile' from source: play vars 18911 1727096317.94799: variable 'profile' from source: play vars 18911 1727096317.94846: variable 'interface' from source: set_fact 18911 1727096317.94883: variable 'interface' from source: set_fact 18911 1727096317.94901: variable 'network_state' from source: role '' defaults 18911 1727096317.94988: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18911 1727096317.95178: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18911 1727096317.95216: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18911 1727096317.95273: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18911 1727096317.95293: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18911 1727096317.95353: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18911 1727096317.95456: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18911 1727096317.95460: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096317.95463: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18911 1727096317.95496: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 18911 1727096317.95510: when evaluation is False, skipping this task 18911 1727096317.95565: _execute() done 18911 1727096317.95571: dumping result to json 18911 1727096317.95574: done dumping result, returning 18911 1727096317.95576: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0afff68d-5257-09a7-aae1-000000000060] 18911 1727096317.95579: sending task result for task 0afff68d-5257-09a7-aae1-000000000060 skipping: [managed_node1] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 18911 1727096317.95831: no more pending results, returning what we have 18911 1727096317.95835: results queue empty 18911 1727096317.95837: checking for 
any_errors_fatal 18911 1727096317.95844: done checking for any_errors_fatal 18911 1727096317.95845: checking for max_fail_percentage 18911 1727096317.95847: done checking for max_fail_percentage 18911 1727096317.95848: checking to see if all hosts have failed and the running result is not ok 18911 1727096317.95849: done checking to see if all hosts have failed 18911 1727096317.95850: getting the remaining hosts for this loop 18911 1727096317.95851: done getting the remaining hosts for this loop 18911 1727096317.95855: getting the next task for host managed_node1 18911 1727096317.95861: done getting next task for host managed_node1 18911 1727096317.95866: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 18911 1727096317.95870: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096317.95883: getting variables 18911 1727096317.95887: in VariableManager get_vars() 18911 1727096317.95930: Calling all_inventory to load vars for managed_node1 18911 1727096317.95933: Calling groups_inventory to load vars for managed_node1 18911 1727096317.95936: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096317.95947: Calling all_plugins_play to load vars for managed_node1 18911 1727096317.95951: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096317.95954: Calling groups_plugins_play to load vars for managed_node1 18911 1727096317.96535: done sending task result for task 0afff68d-5257-09a7-aae1-000000000060 18911 1727096317.96538: WORKER PROCESS EXITING 18911 1727096317.97964: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096317.99650: done with get_vars() 18911 1727096317.99693: done getting variables 18911 1727096317.99754: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Monday 23 September 2024 08:58:37 -0400 (0:00:00.112) 0:00:37.112 ****** 18911 1727096317.99796: entering _queue_task() for managed_node1/dnf 18911 1727096318.00323: worker is 1 (out of 1 available) 18911 1727096318.00333: exiting _queue_task() for managed_node1/dnf 18911 1727096318.00343: done queuing things up, now waiting for results queue to drain 18911 1727096318.00344: waiting for pending results... 
18911 1727096318.00486: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 18911 1727096318.00614: in run() - task 0afff68d-5257-09a7-aae1-000000000061 18911 1727096318.00632: variable 'ansible_search_path' from source: unknown 18911 1727096318.00641: variable 'ansible_search_path' from source: unknown 18911 1727096318.00693: calling self._execute() 18911 1727096318.00869: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096318.00874: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096318.00878: variable 'omit' from source: magic vars 18911 1727096318.01241: variable 'ansible_distribution_major_version' from source: facts 18911 1727096318.01260: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096318.01481: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18911 1727096318.03871: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18911 1727096318.03958: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18911 1727096318.04004: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18911 1727096318.04179: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18911 1727096318.04182: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18911 1727096318.04186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096318.04208: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096318.04238: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096318.04295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096318.04317: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096318.04440: variable 'ansible_distribution' from source: facts 18911 1727096318.04449: variable 'ansible_distribution_major_version' from source: facts 18911 1727096318.04466: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 18911 1727096318.04587: variable '__network_wireless_connections_defined' from source: role '' defaults 18911 1727096318.04745: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096318.04778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096318.04809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096318.04864: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096318.04888: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096318.04941: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096318.04973: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096318.05002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096318.05055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096318.05077: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096318.05122: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096318.05158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 
1727096318.05191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096318.05235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096318.05255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096318.05438: variable 'network_connections' from source: play vars 18911 1727096318.05487: variable 'profile' from source: play vars 18911 1727096318.05540: variable 'profile' from source: play vars 18911 1727096318.05550: variable 'interface' from source: set_fact 18911 1727096318.05625: variable 'interface' from source: set_fact 18911 1727096318.05772: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18911 1727096318.05900: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18911 1727096318.05954: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18911 1727096318.05993: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18911 1727096318.06037: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18911 1727096318.06086: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18911 1727096318.06110: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18911 1727096318.06151: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096318.06181: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18911 1727096318.06251: variable '__network_team_connections_defined' from source: role '' defaults 18911 1727096318.06496: variable 'network_connections' from source: play vars 18911 1727096318.06506: variable 'profile' from source: play vars 18911 1727096318.06578: variable 'profile' from source: play vars 18911 1727096318.06581: variable 'interface' from source: set_fact 18911 1727096318.06687: variable 'interface' from source: set_fact 18911 1727096318.06690: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 18911 1727096318.06693: when evaluation is False, skipping this task 18911 1727096318.06695: _execute() done 18911 1727096318.06697: dumping result to json 18911 1727096318.06700: done dumping result, returning 18911 1727096318.06711: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0afff68d-5257-09a7-aae1-000000000061] 18911 1727096318.06721: sending task result for task 0afff68d-5257-09a7-aae1-000000000061 18911 1727096318.06917: done sending task result for task 0afff68d-5257-09a7-aae1-000000000061 18911 1727096318.06920: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": 
"Conditional result was False" } 18911 1727096318.07171: no more pending results, returning what we have 18911 1727096318.07175: results queue empty 18911 1727096318.07176: checking for any_errors_fatal 18911 1727096318.07183: done checking for any_errors_fatal 18911 1727096318.07184: checking for max_fail_percentage 18911 1727096318.07186: done checking for max_fail_percentage 18911 1727096318.07186: checking to see if all hosts have failed and the running result is not ok 18911 1727096318.07187: done checking to see if all hosts have failed 18911 1727096318.07188: getting the remaining hosts for this loop 18911 1727096318.07189: done getting the remaining hosts for this loop 18911 1727096318.07192: getting the next task for host managed_node1 18911 1727096318.07198: done getting next task for host managed_node1 18911 1727096318.07203: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 18911 1727096318.07205: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096318.07219: getting variables 18911 1727096318.07220: in VariableManager get_vars() 18911 1727096318.07262: Calling all_inventory to load vars for managed_node1 18911 1727096318.07265: Calling groups_inventory to load vars for managed_node1 18911 1727096318.07274: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096318.07287: Calling all_plugins_play to load vars for managed_node1 18911 1727096318.07290: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096318.07293: Calling groups_plugins_play to load vars for managed_node1 18911 1727096318.08813: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096318.10541: done with get_vars() 18911 1727096318.10580: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 18911 1727096318.10665: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Monday 23 September 2024 08:58:38 -0400 (0:00:00.109) 0:00:37.221 ****** 18911 1727096318.10699: entering _queue_task() for managed_node1/yum 18911 1727096318.11277: worker is 1 (out of 1 available) 18911 1727096318.11286: exiting _queue_task() for managed_node1/yum 18911 1727096318.11295: done queuing things up, now waiting for results queue to drain 18911 1727096318.11297: waiting for pending results... 
18911 1727096318.11565: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 18911 1727096318.11573: in run() - task 0afff68d-5257-09a7-aae1-000000000062 18911 1727096318.11577: variable 'ansible_search_path' from source: unknown 18911 1727096318.11580: variable 'ansible_search_path' from source: unknown 18911 1727096318.11583: calling self._execute() 18911 1727096318.11769: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096318.11776: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096318.11779: variable 'omit' from source: magic vars 18911 1727096318.12056: variable 'ansible_distribution_major_version' from source: facts 18911 1727096318.12076: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096318.12248: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18911 1727096318.15222: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18911 1727096318.15327: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18911 1727096318.15335: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18911 1727096318.15400: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18911 1727096318.15403: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18911 1727096318.15493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096318.15526: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096318.15578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096318.15596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096318.15614: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096318.15723: variable 'ansible_distribution_major_version' from source: facts 18911 1727096318.15740: Evaluated conditional (ansible_distribution_major_version | int < 8): False 18911 1727096318.15744: when evaluation is False, skipping this task 18911 1727096318.15747: _execute() done 18911 1727096318.15749: dumping result to json 18911 1727096318.15752: done dumping result, returning 18911 1727096318.15761: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0afff68d-5257-09a7-aae1-000000000062] 18911 1727096318.15764: sending task result for task 0afff68d-5257-09a7-aae1-000000000062 18911 1727096318.15868: done sending task result for task 0afff68d-5257-09a7-aae1-000000000062 18911 1727096318.15872: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 18911 1727096318.15926: no more pending results, returning 
what we have 18911 1727096318.15930: results queue empty 18911 1727096318.15931: checking for any_errors_fatal 18911 1727096318.15937: done checking for any_errors_fatal 18911 1727096318.15938: checking for max_fail_percentage 18911 1727096318.15940: done checking for max_fail_percentage 18911 1727096318.15940: checking to see if all hosts have failed and the running result is not ok 18911 1727096318.15941: done checking to see if all hosts have failed 18911 1727096318.15942: getting the remaining hosts for this loop 18911 1727096318.15944: done getting the remaining hosts for this loop 18911 1727096318.15948: getting the next task for host managed_node1 18911 1727096318.15955: done getting next task for host managed_node1 18911 1727096318.15959: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 18911 1727096318.15962: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096318.15980: getting variables 18911 1727096318.15982: in VariableManager get_vars() 18911 1727096318.16026: Calling all_inventory to load vars for managed_node1 18911 1727096318.16029: Calling groups_inventory to load vars for managed_node1 18911 1727096318.16032: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096318.16044: Calling all_plugins_play to load vars for managed_node1 18911 1727096318.16047: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096318.16050: Calling groups_plugins_play to load vars for managed_node1 18911 1727096318.17839: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096318.19457: done with get_vars() 18911 1727096318.19494: done getting variables 18911 1727096318.19546: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Monday 23 September 2024 08:58:38 -0400 (0:00:00.088) 0:00:37.310 ****** 18911 1727096318.19580: entering _queue_task() for managed_node1/fail 18911 1727096318.19941: worker is 1 (out of 1 available) 18911 1727096318.19953: exiting _queue_task() for managed_node1/fail 18911 1727096318.20171: done queuing things up, now waiting for results queue to drain 18911 1727096318.20173: waiting for pending results... 
18911 1727096318.20334: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 18911 1727096318.20432: in run() - task 0afff68d-5257-09a7-aae1-000000000063 18911 1727096318.20436: variable 'ansible_search_path' from source: unknown 18911 1727096318.20438: variable 'ansible_search_path' from source: unknown 18911 1727096318.20441: calling self._execute() 18911 1727096318.20524: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096318.20527: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096318.20538: variable 'omit' from source: magic vars 18911 1727096318.20957: variable 'ansible_distribution_major_version' from source: facts 18911 1727096318.20976: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096318.21106: variable '__network_wireless_connections_defined' from source: role '' defaults 18911 1727096318.21313: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18911 1727096318.23599: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18911 1727096318.23663: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18911 1727096318.23703: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18911 1727096318.23736: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18911 1727096318.23759: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18911 1727096318.23844: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 18911 1727096318.23898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096318.23924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096318.23964: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096318.24019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096318.24033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096318.24056: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096318.24086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096318.24131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096318.24232: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096318.24235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096318.24239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096318.24242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096318.24275: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096318.24291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096318.24488: variable 'network_connections' from source: play vars 18911 1727096318.24501: variable 'profile' from source: play vars 18911 1727096318.24672: variable 'profile' from source: play vars 18911 1727096318.24675: variable 'interface' from source: set_fact 18911 1727096318.24679: variable 'interface' from source: set_fact 18911 1727096318.24728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18911 1727096318.24912: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18911 1727096318.24944: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18911 1727096318.24984: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18911 1727096318.25019: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18911 1727096318.25055: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18911 1727096318.25081: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18911 1727096318.25111: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096318.25135: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18911 1727096318.25205: variable '__network_team_connections_defined' from source: role '' defaults 18911 1727096318.25467: variable 'network_connections' from source: play vars 18911 1727096318.25472: variable 'profile' from source: play vars 18911 1727096318.25509: variable 'profile' from source: play vars 18911 1727096318.25512: variable 'interface' from source: set_fact 18911 1727096318.25580: variable 'interface' from source: set_fact 18911 1727096318.25607: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 18911 1727096318.25610: when evaluation is False, skipping this task 18911 1727096318.25614: _execute() done 18911 1727096318.25616: dumping result to json 18911 1727096318.25618: done dumping result, returning 18911 1727096318.25634: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's 
consent to restart NetworkManager due to wireless or team interfaces [0afff68d-5257-09a7-aae1-000000000063] 18911 1727096318.25644: sending task result for task 0afff68d-5257-09a7-aae1-000000000063 18911 1727096318.25866: done sending task result for task 0afff68d-5257-09a7-aae1-000000000063 18911 1727096318.25870: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 18911 1727096318.26017: no more pending results, returning what we have 18911 1727096318.26020: results queue empty 18911 1727096318.26021: checking for any_errors_fatal 18911 1727096318.26027: done checking for any_errors_fatal 18911 1727096318.26027: checking for max_fail_percentage 18911 1727096318.26030: done checking for max_fail_percentage 18911 1727096318.26030: checking to see if all hosts have failed and the running result is not ok 18911 1727096318.26031: done checking to see if all hosts have failed 18911 1727096318.26032: getting the remaining hosts for this loop 18911 1727096318.26033: done getting the remaining hosts for this loop 18911 1727096318.26036: getting the next task for host managed_node1 18911 1727096318.26042: done getting next task for host managed_node1 18911 1727096318.26046: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 18911 1727096318.26048: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096318.26060: getting variables 18911 1727096318.26062: in VariableManager get_vars() 18911 1727096318.26103: Calling all_inventory to load vars for managed_node1 18911 1727096318.26106: Calling groups_inventory to load vars for managed_node1 18911 1727096318.26109: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096318.26119: Calling all_plugins_play to load vars for managed_node1 18911 1727096318.26122: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096318.26125: Calling groups_plugins_play to load vars for managed_node1 18911 1727096318.27351: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096318.28398: done with get_vars() 18911 1727096318.28416: done getting variables 18911 1727096318.28462: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Monday 23 September 2024 08:58:38 -0400 (0:00:00.089) 0:00:37.399 ****** 18911 1727096318.28491: entering _queue_task() for managed_node1/package 18911 1727096318.28761: worker is 1 (out of 1 available) 18911 1727096318.28779: exiting _queue_task() for managed_node1/package 18911 1727096318.28790: done queuing things up, now waiting for results queue to drain 18911 1727096318.28792: waiting for pending results... 
18911 1727096318.28978: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 18911 1727096318.29054: in run() - task 0afff68d-5257-09a7-aae1-000000000064 18911 1727096318.29068: variable 'ansible_search_path' from source: unknown 18911 1727096318.29073: variable 'ansible_search_path' from source: unknown 18911 1727096318.29101: calling self._execute() 18911 1727096318.29182: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096318.29186: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096318.29195: variable 'omit' from source: magic vars 18911 1727096318.29498: variable 'ansible_distribution_major_version' from source: facts 18911 1727096318.29529: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096318.29766: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18911 1727096318.30070: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18911 1727096318.30134: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18911 1727096318.30176: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18911 1727096318.30276: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18911 1727096318.30426: variable 'network_packages' from source: role '' defaults 18911 1727096318.30569: variable '__network_provider_setup' from source: role '' defaults 18911 1727096318.30573: variable '__network_service_name_default_nm' from source: role '' defaults 18911 1727096318.30633: variable '__network_service_name_default_nm' from source: role '' defaults 18911 1727096318.30637: variable '__network_packages_default_nm' from source: role '' defaults 18911 1727096318.30682: variable 
'__network_packages_default_nm' from source: role '' defaults 18911 1727096318.30803: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18911 1727096318.32153: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18911 1727096318.32199: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18911 1727096318.32227: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18911 1727096318.32250: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18911 1727096318.32275: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18911 1727096318.32334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096318.32354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096318.32377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096318.32406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096318.32417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 
1727096318.32449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096318.32470: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096318.32489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096318.32517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096318.32528: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096318.32679: variable '__network_packages_default_gobject_packages' from source: role '' defaults 18911 1727096318.32757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096318.32776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096318.32793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096318.32824: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096318.32833: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096318.32896: variable 'ansible_python' from source: facts 18911 1727096318.32919: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 18911 1727096318.32979: variable '__network_wpa_supplicant_required' from source: role '' defaults 18911 1727096318.33036: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 18911 1727096318.33132: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096318.33150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096318.33170: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096318.33196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096318.33207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096318.33238: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096318.33258: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096318.33280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096318.33330: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096318.33333: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096318.33432: variable 'network_connections' from source: play vars 18911 1727096318.33442: variable 'profile' from source: play vars 18911 1727096318.33581: variable 'profile' from source: play vars 18911 1727096318.33585: variable 'interface' from source: set_fact 18911 1727096318.33611: variable 'interface' from source: set_fact 18911 1727096318.33698: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18911 1727096318.33706: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18911 1727096318.33777: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096318.33780: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18911 1727096318.33803: variable '__network_wireless_connections_defined' from source: role '' defaults 18911 1727096318.34072: variable 'network_connections' from source: play vars 18911 1727096318.34075: variable 'profile' from source: play vars 18911 1727096318.34172: variable 'profile' from source: play vars 18911 1727096318.34215: variable 'interface' from source: set_fact 18911 1727096318.34244: variable 'interface' from source: set_fact 18911 1727096318.34280: variable '__network_packages_default_wireless' from source: role '' defaults 18911 1727096318.34355: variable '__network_wireless_connections_defined' from source: role '' defaults 18911 1727096318.34654: variable 'network_connections' from source: play vars 18911 1727096318.34657: variable 'profile' from source: play vars 18911 1727096318.34729: variable 'profile' from source: play vars 18911 1727096318.34732: variable 'interface' from source: set_fact 18911 1727096318.34826: variable 'interface' from source: set_fact 18911 1727096318.34850: variable '__network_packages_default_team' from source: role '' defaults 18911 1727096318.34928: variable '__network_team_connections_defined' from source: role '' defaults 18911 1727096318.35273: variable 'network_connections' from source: play vars 18911 1727096318.35276: variable 'profile' from source: play vars 18911 1727096318.35282: variable 'profile' from source: play vars 18911 1727096318.35286: variable 'interface' from source: set_fact 18911 1727096318.35382: variable 'interface' from source: set_fact 18911 1727096318.35436: variable '__network_service_name_default_initscripts' from source: role '' defaults 18911 1727096318.35496: 
variable '__network_service_name_default_initscripts' from source: role '' defaults 18911 1727096318.35502: variable '__network_packages_default_initscripts' from source: role '' defaults 18911 1727096318.35558: variable '__network_packages_default_initscripts' from source: role '' defaults 18911 1727096318.35756: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 18911 1727096318.36248: variable 'network_connections' from source: play vars 18911 1727096318.36251: variable 'profile' from source: play vars 18911 1727096318.36293: variable 'profile' from source: play vars 18911 1727096318.36296: variable 'interface' from source: set_fact 18911 1727096318.36357: variable 'interface' from source: set_fact 18911 1727096318.36370: variable 'ansible_distribution' from source: facts 18911 1727096318.36606: variable '__network_rh_distros' from source: role '' defaults 18911 1727096318.36673: variable 'ansible_distribution_major_version' from source: facts 18911 1727096318.36677: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 18911 1727096318.36989: variable 'ansible_distribution' from source: facts 18911 1727096318.36993: variable '__network_rh_distros' from source: role '' defaults 18911 1727096318.36998: variable 'ansible_distribution_major_version' from source: facts 18911 1727096318.37012: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 18911 1727096318.37172: variable 'ansible_distribution' from source: facts 18911 1727096318.37446: variable '__network_rh_distros' from source: role '' defaults 18911 1727096318.37449: variable 'ansible_distribution_major_version' from source: facts 18911 1727096318.37451: variable 'network_provider' from source: set_fact 18911 1727096318.37453: variable 'ansible_facts' from source: unknown 18911 1727096318.38476: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 18911 
1727096318.38480: when evaluation is False, skipping this task 18911 1727096318.38483: _execute() done 18911 1727096318.38485: dumping result to json 18911 1727096318.38487: done dumping result, returning 18911 1727096318.38496: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0afff68d-5257-09a7-aae1-000000000064] 18911 1727096318.38498: sending task result for task 0afff68d-5257-09a7-aae1-000000000064 18911 1727096318.38712: done sending task result for task 0afff68d-5257-09a7-aae1-000000000064 18911 1727096318.38715: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 18911 1727096318.38765: no more pending results, returning what we have 18911 1727096318.38770: results queue empty 18911 1727096318.38771: checking for any_errors_fatal 18911 1727096318.38777: done checking for any_errors_fatal 18911 1727096318.38778: checking for max_fail_percentage 18911 1727096318.38779: done checking for max_fail_percentage 18911 1727096318.38780: checking to see if all hosts have failed and the running result is not ok 18911 1727096318.38781: done checking to see if all hosts have failed 18911 1727096318.38781: getting the remaining hosts for this loop 18911 1727096318.38782: done getting the remaining hosts for this loop 18911 1727096318.38785: getting the next task for host managed_node1 18911 1727096318.38790: done getting next task for host managed_node1 18911 1727096318.38794: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 18911 1727096318.38795: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 18911 1727096318.38808: getting variables 18911 1727096318.38809: in VariableManager get_vars() 18911 1727096318.38845: Calling all_inventory to load vars for managed_node1 18911 1727096318.38848: Calling groups_inventory to load vars for managed_node1 18911 1727096318.38850: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096318.38866: Calling all_plugins_play to load vars for managed_node1 18911 1727096318.38871: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096318.38874: Calling groups_plugins_play to load vars for managed_node1 18911 1727096318.41205: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096318.43715: done with get_vars() 18911 1727096318.43759: done getting variables 18911 1727096318.43820: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Monday 23 September 2024 08:58:38 -0400 (0:00:00.153) 0:00:37.553 ****** 18911 1727096318.43865: entering _queue_task() for managed_node1/package 18911 1727096318.44395: worker is 1 (out of 1 available) 18911 1727096318.44408: exiting _queue_task() for managed_node1/package 18911 1727096318.44420: done queuing things up, now waiting for results queue to drain 18911 1727096318.44421: waiting for pending results... 
18911 1727096318.44946: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 18911 1727096318.45151: in run() - task 0afff68d-5257-09a7-aae1-000000000065 18911 1727096318.45155: variable 'ansible_search_path' from source: unknown 18911 1727096318.45158: variable 'ansible_search_path' from source: unknown 18911 1727096318.45221: calling self._execute() 18911 1727096318.45403: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096318.45540: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096318.45543: variable 'omit' from source: magic vars 18911 1727096318.46260: variable 'ansible_distribution_major_version' from source: facts 18911 1727096318.46285: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096318.46440: variable 'network_state' from source: role '' defaults 18911 1727096318.46468: Evaluated conditional (network_state != {}): False 18911 1727096318.46477: when evaluation is False, skipping this task 18911 1727096318.46484: _execute() done 18911 1727096318.46491: dumping result to json 18911 1727096318.46498: done dumping result, returning 18911 1727096318.46510: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0afff68d-5257-09a7-aae1-000000000065] 18911 1727096318.46533: sending task result for task 0afff68d-5257-09a7-aae1-000000000065 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18911 1727096318.46799: no more pending results, returning what we have 18911 1727096318.46804: results queue empty 18911 1727096318.46805: checking for any_errors_fatal 18911 1727096318.46812: done checking for any_errors_fatal 18911 1727096318.46813: checking for max_fail_percentage 18911 
1727096318.46815: done checking for max_fail_percentage 18911 1727096318.46816: checking to see if all hosts have failed and the running result is not ok 18911 1727096318.46816: done checking to see if all hosts have failed 18911 1727096318.46817: getting the remaining hosts for this loop 18911 1727096318.46819: done getting the remaining hosts for this loop 18911 1727096318.46822: getting the next task for host managed_node1 18911 1727096318.46830: done getting next task for host managed_node1 18911 1727096318.46834: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 18911 1727096318.46837: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18911 1727096318.46989: getting variables 18911 1727096318.46991: in VariableManager get_vars() 18911 1727096318.47033: Calling all_inventory to load vars for managed_node1 18911 1727096318.47036: Calling groups_inventory to load vars for managed_node1 18911 1727096318.47039: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096318.47051: Calling all_plugins_play to load vars for managed_node1 18911 1727096318.47055: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096318.47058: Calling groups_plugins_play to load vars for managed_node1 18911 1727096318.47695: done sending task result for task 0afff68d-5257-09a7-aae1-000000000065 18911 1727096318.47699: WORKER PROCESS EXITING 18911 1727096318.49900: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096318.52047: done with get_vars() 18911 1727096318.52090: done getting variables 18911 1727096318.52160: Loading ActionModule 'package' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Monday 23 September 2024 08:58:38 -0400 (0:00:00.083) 0:00:37.636 ****** 18911 1727096318.52198: entering _queue_task() for managed_node1/package 18911 1727096318.52791: worker is 1 (out of 1 available) 18911 1727096318.52801: exiting _queue_task() for managed_node1/package 18911 1727096318.52812: done queuing things up, now waiting for results queue to drain 18911 1727096318.52814: waiting for pending results... 18911 1727096318.53082: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 18911 1727096318.53212: in run() - task 0afff68d-5257-09a7-aae1-000000000066 18911 1727096318.53232: variable 'ansible_search_path' from source: unknown 18911 1727096318.53240: variable 'ansible_search_path' from source: unknown 18911 1727096318.53301: calling self._execute() 18911 1727096318.53428: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096318.53441: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096318.53456: variable 'omit' from source: magic vars 18911 1727096318.53943: variable 'ansible_distribution_major_version' from source: facts 18911 1727096318.53947: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096318.54075: variable 'network_state' from source: role '' defaults 18911 1727096318.54095: Evaluated conditional (network_state != {}): False 18911 1727096318.54104: when evaluation is False, 
skipping this task 18911 1727096318.54113: _execute() done 18911 1727096318.54162: dumping result to json 18911 1727096318.54171: done dumping result, returning 18911 1727096318.54177: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0afff68d-5257-09a7-aae1-000000000066] 18911 1727096318.54184: sending task result for task 0afff68d-5257-09a7-aae1-000000000066 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18911 1727096318.54453: no more pending results, returning what we have 18911 1727096318.54457: results queue empty 18911 1727096318.54458: checking for any_errors_fatal 18911 1727096318.54473: done checking for any_errors_fatal 18911 1727096318.54474: checking for max_fail_percentage 18911 1727096318.54476: done checking for max_fail_percentage 18911 1727096318.54477: checking to see if all hosts have failed and the running result is not ok 18911 1727096318.54478: done checking to see if all hosts have failed 18911 1727096318.54479: getting the remaining hosts for this loop 18911 1727096318.54480: done getting the remaining hosts for this loop 18911 1727096318.54518: getting the next task for host managed_node1 18911 1727096318.54527: done getting next task for host managed_node1 18911 1727096318.54532: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 18911 1727096318.54535: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096318.54578: done sending task result for task 0afff68d-5257-09a7-aae1-000000000066 18911 1727096318.54581: WORKER PROCESS EXITING 18911 1727096318.54591: getting variables 18911 1727096318.54600: in VariableManager get_vars() 18911 1727096318.54726: Calling all_inventory to load vars for managed_node1 18911 1727096318.54736: Calling groups_inventory to load vars for managed_node1 18911 1727096318.54739: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096318.54750: Calling all_plugins_play to load vars for managed_node1 18911 1727096318.54754: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096318.54757: Calling groups_plugins_play to load vars for managed_node1 18911 1727096318.57554: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096318.60084: done with get_vars() 18911 1727096318.60120: done getting variables 18911 1727096318.60192: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Monday 23 September 2024 08:58:38 -0400 (0:00:00.080) 0:00:37.716 ****** 18911 1727096318.60224: entering _queue_task() for managed_node1/service 18911 1727096318.60619: worker is 1 (out of 1 available) 18911 1727096318.60631: exiting _queue_task() for managed_node1/service 18911 1727096318.60646: done queuing things up, now waiting for results queue to drain 18911 1727096318.60648: waiting for pending results... 
18911 1727096318.60965: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 18911 1727096318.61094: in run() - task 0afff68d-5257-09a7-aae1-000000000067 18911 1727096318.61098: variable 'ansible_search_path' from source: unknown 18911 1727096318.61100: variable 'ansible_search_path' from source: unknown 18911 1727096318.61125: calling self._execute() 18911 1727096318.61234: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096318.61238: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096318.61247: variable 'omit' from source: magic vars 18911 1727096318.61641: variable 'ansible_distribution_major_version' from source: facts 18911 1727096318.61645: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096318.61755: variable '__network_wireless_connections_defined' from source: role '' defaults 18911 1727096318.61988: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18911 1727096318.64331: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18911 1727096318.64379: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18911 1727096318.64408: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18911 1727096318.64433: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18911 1727096318.64454: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18911 1727096318.64519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 18911 1727096318.64540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096318.64559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096318.64589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096318.64600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096318.64636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096318.64652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096318.64672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096318.64698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096318.64708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096318.64740: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096318.64755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096318.64775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096318.64800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096318.64810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096318.64925: variable 'network_connections' from source: play vars 18911 1727096318.64937: variable 'profile' from source: play vars 18911 1727096318.64995: variable 'profile' from source: play vars 18911 1727096318.64999: variable 'interface' from source: set_fact 18911 1727096318.65043: variable 'interface' from source: set_fact 18911 1727096318.65098: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18911 1727096318.65324: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18911 1727096318.65327: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18911 1727096318.65329: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18911 1727096318.65346: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18911 1727096318.65379: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18911 1727096318.65480: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18911 1727096318.65483: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096318.65485: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18911 1727096318.65537: variable '__network_team_connections_defined' from source: role '' defaults 18911 1727096318.65805: variable 'network_connections' from source: play vars 18911 1727096318.65828: variable 'profile' from source: play vars 18911 1727096318.65895: variable 'profile' from source: play vars 18911 1727096318.65904: variable 'interface' from source: set_fact 18911 1727096318.66043: variable 'interface' from source: set_fact 18911 1727096318.66046: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 18911 1727096318.66048: when evaluation is False, skipping this task 18911 1727096318.66050: _execute() done 18911 1727096318.66053: dumping result to json 18911 1727096318.66055: done dumping result, returning 18911 1727096318.66056: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart 
NetworkManager due to wireless or team interfaces [0afff68d-5257-09a7-aae1-000000000067] 18911 1727096318.66071: sending task result for task 0afff68d-5257-09a7-aae1-000000000067 18911 1727096318.66253: done sending task result for task 0afff68d-5257-09a7-aae1-000000000067 18911 1727096318.66257: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 18911 1727096318.66312: no more pending results, returning what we have 18911 1727096318.66315: results queue empty 18911 1727096318.66316: checking for any_errors_fatal 18911 1727096318.66323: done checking for any_errors_fatal 18911 1727096318.66323: checking for max_fail_percentage 18911 1727096318.66325: done checking for max_fail_percentage 18911 1727096318.66325: checking to see if all hosts have failed and the running result is not ok 18911 1727096318.66326: done checking to see if all hosts have failed 18911 1727096318.66327: getting the remaining hosts for this loop 18911 1727096318.66328: done getting the remaining hosts for this loop 18911 1727096318.66332: getting the next task for host managed_node1 18911 1727096318.66338: done getting next task for host managed_node1 18911 1727096318.66342: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 18911 1727096318.66344: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096318.66356: getting variables 18911 1727096318.66358: in VariableManager get_vars() 18911 1727096318.66403: Calling all_inventory to load vars for managed_node1 18911 1727096318.66406: Calling groups_inventory to load vars for managed_node1 18911 1727096318.66409: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096318.66418: Calling all_plugins_play to load vars for managed_node1 18911 1727096318.66421: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096318.66423: Calling groups_plugins_play to load vars for managed_node1 18911 1727096318.67386: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096318.68263: done with get_vars() 18911 1727096318.68284: done getting variables 18911 1727096318.68326: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Monday 23 September 2024 08:58:38 -0400 (0:00:00.081) 0:00:37.797 ****** 18911 1727096318.68347: entering _queue_task() for managed_node1/service 18911 1727096318.68606: worker is 1 (out of 1 available) 18911 1727096318.68619: exiting _queue_task() for managed_node1/service 18911 1727096318.68631: done queuing things up, now waiting for results queue to drain 18911 1727096318.68633: waiting for pending results... 
18911 1727096318.68818: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 18911 1727096318.68899: in run() - task 0afff68d-5257-09a7-aae1-000000000068 18911 1727096318.68909: variable 'ansible_search_path' from source: unknown 18911 1727096318.68912: variable 'ansible_search_path' from source: unknown 18911 1727096318.68947: calling self._execute() 18911 1727096318.69026: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096318.69030: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096318.69039: variable 'omit' from source: magic vars 18911 1727096318.69329: variable 'ansible_distribution_major_version' from source: facts 18911 1727096318.69339: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096318.69477: variable 'network_provider' from source: set_fact 18911 1727096318.69481: variable 'network_state' from source: role '' defaults 18911 1727096318.69490: Evaluated conditional (network_provider == "nm" or network_state != {}): True 18911 1727096318.69505: variable 'omit' from source: magic vars 18911 1727096318.69544: variable 'omit' from source: magic vars 18911 1727096318.69577: variable 'network_service_name' from source: role '' defaults 18911 1727096318.69636: variable 'network_service_name' from source: role '' defaults 18911 1727096318.69717: variable '__network_provider_setup' from source: role '' defaults 18911 1727096318.69729: variable '__network_service_name_default_nm' from source: role '' defaults 18911 1727096318.69791: variable '__network_service_name_default_nm' from source: role '' defaults 18911 1727096318.69799: variable '__network_packages_default_nm' from source: role '' defaults 18911 1727096318.69859: variable '__network_packages_default_nm' from source: role '' defaults 18911 1727096318.70020: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 18911 1727096318.71786: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18911 1727096318.71866: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18911 1727096318.71907: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18911 1727096318.71938: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18911 1727096318.71965: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18911 1727096318.72047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096318.72087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096318.72121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096318.72153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096318.72170: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096318.72229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 18911 1727096318.72260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096318.72265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096318.72319: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096318.72323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096318.72561: variable '__network_packages_default_gobject_packages' from source: role '' defaults 18911 1727096318.72697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096318.72700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096318.72731: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096318.72769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096318.72784: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096318.72872: variable 'ansible_python' from source: facts 18911 1727096318.72888: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 18911 1727096318.72990: variable '__network_wpa_supplicant_required' from source: role '' defaults 18911 1727096318.73062: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 18911 1727096318.73179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096318.73192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096318.73220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096318.73250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096318.73261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096318.73298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096318.73330: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096318.73350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096318.73378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096318.73389: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096318.73483: variable 'network_connections' from source: play vars 18911 1727096318.73493: variable 'profile' from source: play vars 18911 1727096318.73552: variable 'profile' from source: play vars 18911 1727096318.73557: variable 'interface' from source: set_fact 18911 1727096318.73603: variable 'interface' from source: set_fact 18911 1727096318.73677: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18911 1727096318.73886: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18911 1727096318.73944: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18911 1727096318.73976: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18911 1727096318.74021: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18911 1727096318.74075: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18911 1727096318.74103: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18911 1727096318.74136: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096318.74159: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18911 1727096318.74205: variable '__network_wireless_connections_defined' from source: role '' defaults 18911 1727096318.74461: variable 'network_connections' from source: play vars 18911 1727096318.74469: variable 'profile' from source: play vars 18911 1727096318.74541: variable 'profile' from source: play vars 18911 1727096318.74544: variable 'interface' from source: set_fact 18911 1727096318.74599: variable 'interface' from source: set_fact 18911 1727096318.74649: variable '__network_packages_default_wireless' from source: role '' defaults 18911 1727096318.74705: variable '__network_wireless_connections_defined' from source: role '' defaults 18911 1727096318.74908: variable 'network_connections' from source: play vars 18911 1727096318.74911: variable 'profile' from source: play vars 18911 1727096318.74972: variable 'profile' from source: play vars 18911 1727096318.74975: variable 'interface' from source: set_fact 18911 1727096318.75042: variable 'interface' from source: set_fact 18911 1727096318.75067: variable '__network_packages_default_team' from source: role '' defaults 18911 1727096318.75122: variable '__network_team_connections_defined' from source: role '' defaults 18911 1727096318.75348: variable 
'network_connections' from source: play vars 18911 1727096318.75352: variable 'profile' from source: play vars 18911 1727096318.75406: variable 'profile' from source: play vars 18911 1727096318.75409: variable 'interface' from source: set_fact 18911 1727096318.75470: variable 'interface' from source: set_fact 18911 1727096318.75506: variable '__network_service_name_default_initscripts' from source: role '' defaults 18911 1727096318.75547: variable '__network_service_name_default_initscripts' from source: role '' defaults 18911 1727096318.75556: variable '__network_packages_default_initscripts' from source: role '' defaults 18911 1727096318.75602: variable '__network_packages_default_initscripts' from source: role '' defaults 18911 1727096318.75751: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 18911 1727096318.76150: variable 'network_connections' from source: play vars 18911 1727096318.76153: variable 'profile' from source: play vars 18911 1727096318.76197: variable 'profile' from source: play vars 18911 1727096318.76200: variable 'interface' from source: set_fact 18911 1727096318.76251: variable 'interface' from source: set_fact 18911 1727096318.76259: variable 'ansible_distribution' from source: facts 18911 1727096318.76261: variable '__network_rh_distros' from source: role '' defaults 18911 1727096318.76270: variable 'ansible_distribution_major_version' from source: facts 18911 1727096318.76283: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 18911 1727096318.76397: variable 'ansible_distribution' from source: facts 18911 1727096318.76400: variable '__network_rh_distros' from source: role '' defaults 18911 1727096318.76405: variable 'ansible_distribution_major_version' from source: facts 18911 1727096318.76448: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 18911 1727096318.76600: variable 'ansible_distribution' from source: 
facts 18911 1727096318.76604: variable '__network_rh_distros' from source: role '' defaults 18911 1727096318.76607: variable 'ansible_distribution_major_version' from source: facts 18911 1727096318.76637: variable 'network_provider' from source: set_fact 18911 1727096318.76656: variable 'omit' from source: magic vars 18911 1727096318.76684: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18911 1727096318.76707: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18911 1727096318.76722: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18911 1727096318.76737: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096318.76745: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096318.76771: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18911 1727096318.76774: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096318.76777: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096318.76849: Set connection var ansible_shell_executable to /bin/sh 18911 1727096318.76854: Set connection var ansible_timeout to 10 18911 1727096318.76856: Set connection var ansible_shell_type to sh 18911 1727096318.76865: Set connection var ansible_module_compression to ZIP_DEFLATED 18911 1727096318.76869: Set connection var ansible_pipelining to False 18911 1727096318.76874: Set connection var ansible_connection to ssh 18911 1727096318.76893: variable 'ansible_shell_executable' from source: unknown 18911 1727096318.76897: variable 'ansible_connection' from source: unknown 18911 1727096318.76900: variable 'ansible_module_compression' from source: unknown 18911 1727096318.76902: 
variable 'ansible_shell_type' from source: unknown 18911 1727096318.76906: variable 'ansible_shell_executable' from source: unknown 18911 1727096318.76908: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096318.76915: variable 'ansible_pipelining' from source: unknown 18911 1727096318.76917: variable 'ansible_timeout' from source: unknown 18911 1727096318.76919: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096318.76995: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18911 1727096318.77004: variable 'omit' from source: magic vars 18911 1727096318.77010: starting attempt loop 18911 1727096318.77012: running the handler 18911 1727096318.77072: variable 'ansible_facts' from source: unknown 18911 1727096318.77559: _low_level_execute_command(): starting 18911 1727096318.77566: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18911 1727096318.78062: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096318.78071: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 18911 1727096318.78100: stderr chunk (state=3): >>>debug2: match not found <<< 18911 1727096318.78103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration <<< 18911 1727096318.78106: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 18911 1727096318.78108: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096318.78155: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096318.78158: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096318.78178: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096318.78260: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096318.79977: stdout chunk (state=3): >>>/root <<< 18911 1727096318.80065: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096318.80105: stderr chunk (state=3): >>><<< 18911 1727096318.80108: stdout chunk (state=3): >>><<< 18911 1727096318.80126: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096318.80137: _low_level_execute_command(): starting 18911 1727096318.80143: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096318.8012712-20696-214414428953618 `" && echo ansible-tmp-1727096318.8012712-20696-214414428953618="` echo /root/.ansible/tmp/ansible-tmp-1727096318.8012712-20696-214414428953618 `" ) && sleep 0' 18911 1727096318.80608: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096318.80611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 18911 1727096318.80615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096318.80619: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096318.80622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096318.80684: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096318.80688: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096318.80795: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096318.82713: stdout chunk (state=3): >>>ansible-tmp-1727096318.8012712-20696-214414428953618=/root/.ansible/tmp/ansible-tmp-1727096318.8012712-20696-214414428953618 <<< 18911 1727096318.82829: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096318.82854: stderr chunk (state=3): >>><<< 18911 1727096318.82864: stdout chunk (state=3): >>><<< 18911 1727096318.82883: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096318.8012712-20696-214414428953618=/root/.ansible/tmp/ansible-tmp-1727096318.8012712-20696-214414428953618 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096318.82951: variable 'ansible_module_compression' from source: unknown 18911 1727096318.83003: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18911d7od04qi/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 18911 1727096318.83068: variable 'ansible_facts' from source: unknown 18911 1727096318.83213: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096318.8012712-20696-214414428953618/AnsiballZ_systemd.py 18911 1727096318.83332: Sending initial data 18911 1727096318.83341: Sent initial data (156 bytes) 18911 1727096318.83979: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096318.84020: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096318.84023: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096318.84028: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096318.84103: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096318.85684: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18911 1727096318.85745: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18911 1727096318.85811: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18911d7od04qi/tmpogek0wqx /root/.ansible/tmp/ansible-tmp-1727096318.8012712-20696-214414428953618/AnsiballZ_systemd.py <<< 18911 1727096318.85817: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096318.8012712-20696-214414428953618/AnsiballZ_systemd.py" <<< 18911 1727096318.85883: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18911d7od04qi/tmpogek0wqx" to remote "/root/.ansible/tmp/ansible-tmp-1727096318.8012712-20696-214414428953618/AnsiballZ_systemd.py" <<< 18911 1727096318.85885: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096318.8012712-20696-214414428953618/AnsiballZ_systemd.py" <<< 18911 1727096318.87189: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096318.87234: stderr chunk (state=3): >>><<< 18911 1727096318.87238: stdout chunk (state=3): >>><<< 18911 1727096318.87265: done transferring module to remote 18911 1727096318.87277: _low_level_execute_command(): starting 18911 1727096318.87282: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096318.8012712-20696-214414428953618/ /root/.ansible/tmp/ansible-tmp-1727096318.8012712-20696-214414428953618/AnsiballZ_systemd.py && sleep 0' 18911 1727096318.87800: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096318.87803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 
1727096318.87806: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration <<< 18911 1727096318.87808: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096318.87810: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096318.87858: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096318.87865: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096318.87869: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096318.87944: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096318.89798: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096318.89823: stderr chunk (state=3): >>><<< 18911 1727096318.89827: stdout chunk (state=3): >>><<< 18911 1727096318.89841: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096318.89845: _low_level_execute_command(): starting 18911 1727096318.89850: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096318.8012712-20696-214414428953618/AnsiballZ_systemd.py && sleep 0' 18911 1727096318.90343: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096318.90346: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096318.90349: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration <<< 18911 1727096318.90356: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 18911 1727096318.90358: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096318.90405: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096318.90410: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096318.90420: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096318.90511: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096319.19897: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Mon 2024-09-23 08:51:00 EDT", "ExecMainStartTimestampMonotonic": "14125756", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Mon 2024-09-23 08:51:00 EDT", "ExecMainHandoffTimestampMonotonic": "14143412", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; 
start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10629120", "MemoryPeak": "14716928", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3306291200", "EffectiveMemoryMax": "3702857728", "EffectiveMemoryHigh": "3702857728", "CPUUsageNSec": "920283000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": 
"0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpRe<<< 18911 1727096319.19910: stdout chunk (state=3): >>>ceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", 
"TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target multi-user.target cloud-init.service NetworkManager-wait-online.service network.target", "After": "system.<<< 18911 1727096319.19919: stdout chunk (state=3): >>>slice dbus-broker.service systemd-journald.socket sysinit.target network-pre.target dbus.socket cloud-init-local.service basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:55:06 EDT", "StateChangeTimestampMonotonic": "260104767", "InactiveExitTimestamp": "Mon 2024-09-23 08:51:00 EDT", "InactiveExitTimestampMonotonic": "14126240", "ActiveEnterTimestamp": "Mon 2024-09-23 08:51:01 EDT", "ActiveEnterTimestampMonotonic": "14391210", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", 
"SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:51:00 EDT", "ConditionTimestampMonotonic": "14124859", "AssertTimestamp": "Mon 2024-09-23 08:51:00 EDT", "AssertTimestampMonotonic": "14124861", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "96e31adf3b0143aea7f2b03db689d56d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 18911 1727096319.21844: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
<<< 18911 1727096319.21872: stderr chunk (state=3): >>><<< 18911 1727096319.21875: stdout chunk (state=3): >>><<< 18911 1727096319.21889: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Mon 2024-09-23 08:51:00 EDT", "ExecMainStartTimestampMonotonic": "14125756", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Mon 2024-09-23 08:51:00 EDT", "ExecMainHandoffTimestampMonotonic": "14143412", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10629120", "MemoryPeak": "14716928", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3306291200", "EffectiveMemoryMax": "3702857728", "EffectiveMemoryHigh": "3702857728", "CPUUsageNSec": "920283000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target multi-user.target cloud-init.service NetworkManager-wait-online.service network.target", "After": "system.slice dbus-broker.service systemd-journald.socket sysinit.target network-pre.target dbus.socket cloud-init-local.service basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:55:06 EDT", "StateChangeTimestampMonotonic": "260104767", "InactiveExitTimestamp": "Mon 2024-09-23 08:51:00 EDT", "InactiveExitTimestampMonotonic": "14126240", "ActiveEnterTimestamp": "Mon 2024-09-23 08:51:01 EDT", "ActiveEnterTimestampMonotonic": "14391210", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:51:00 EDT", "ConditionTimestampMonotonic": "14124859", "AssertTimestamp": "Mon 2024-09-23 08:51:00 EDT", "AssertTimestampMonotonic": "14124861", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "96e31adf3b0143aea7f2b03db689d56d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
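The `"invocation"` block recorded above (`module_args: {"name": "NetworkManager", "state": "started", "enabled": true, ...}`) is the systemd service module being driven by the role. As a minimal sketch of the kind of task that would produce this invocation (an illustration only, not the literal task from `roles/network/tasks/main.yml`):

```yaml
# Hedged sketch: a task shaped like the module_args in the trace above.
# The real task lives in the fedora.linux_system_roles.network role.
- name: Enable and start NetworkManager
  ansible.builtin.systemd:
    name: NetworkManager
    state: started   # ensure the unit is running now
    enabled: true    # ensure the unit starts at boot
```

With `state: started` and `enabled: true` together, the module is idempotent; the result above reports `"changed": false` because the unit was already `ActiveState: active` and `UnitFileState: enabled`.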
18911 1727096319.22009: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096318.8012712-20696-214414428953618/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18911 1727096319.22025: _low_level_execute_command(): starting 18911 1727096319.22029: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096318.8012712-20696-214414428953618/ > /dev/null 2>&1 && sleep 0' 18911 1727096319.22503: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096319.22506: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096319.22509: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096319.22513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found <<< 18911 1727096319.22515: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096319.22561: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096319.22564: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096319.22566: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096319.22645: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096319.24514: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096319.24539: stderr chunk (state=3): >>><<< 18911 1727096319.24542: stdout chunk (state=3): >>><<< 18911 1727096319.24557: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: 
fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096319.24563: handler run complete 18911 1727096319.24612: attempt loop complete, returning result 18911 1727096319.24772: _execute() done 18911 1727096319.24776: dumping result to json 18911 1727096319.24778: done dumping result, returning 18911 1727096319.24780: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0afff68d-5257-09a7-aae1-000000000068] 18911 1727096319.24782: sending task result for task 0afff68d-5257-09a7-aae1-000000000068 18911 1727096319.25086: done sending task result for task 0afff68d-5257-09a7-aae1-000000000068 18911 1727096319.25089: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18911 1727096319.25135: no more pending results, returning what we have 18911 1727096319.25138: results queue empty 18911 1727096319.25139: checking for any_errors_fatal 18911 1727096319.25145: done checking for any_errors_fatal 18911 1727096319.25146: checking for max_fail_percentage 18911 1727096319.25147: done checking for max_fail_percentage 18911 1727096319.25148: checking to see if all hosts have failed and the running result is not ok 18911 1727096319.25149: done checking to see if all hosts have failed 18911 1727096319.25149: getting the remaining hosts for this loop 18911 1727096319.25150: done getting the remaining hosts for this loop 18911 1727096319.25153: getting the next task for host managed_node1 18911 1727096319.25159: done getting next task for host managed_node1 18911 1727096319.25162: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 18911 1727096319.25164: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, 
run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18911 1727096319.25175: getting variables 18911 1727096319.25177: in VariableManager get_vars() 18911 1727096319.25242: Calling all_inventory to load vars for managed_node1 18911 1727096319.25245: Calling groups_inventory to load vars for managed_node1 18911 1727096319.25248: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096319.25257: Calling all_plugins_play to load vars for managed_node1 18911 1727096319.25259: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096319.25261: Calling groups_plugins_play to load vars for managed_node1 18911 1727096319.27187: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096319.28578: done with get_vars() 18911 1727096319.28598: done getting variables 18911 1727096319.28645: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Monday 23 September 2024 08:58:39 -0400 (0:00:00.603) 0:00:38.401 ****** 18911 1727096319.28672: entering _queue_task() for managed_node1/service 18911 1727096319.28940: worker is 1 (out of 1 available) 18911 1727096319.28952: exiting _queue_task() for managed_node1/service 18911 1727096319.28969: done queuing things up, now waiting for results queue to drain 18911 1727096319.28971: waiting for pending results... 
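The `"censored"` result shown above comes from Ansible's `no_log` handling (note `'_ansible_no_log': True` in the module arguments): the full return payload is replaced with a placeholder in play output. A hypothetical minimal example of the pattern, not taken from the role itself:

```yaml
# Hypothetical illustration of no_log: the task still runs normally,
# but its result is hidden in output and logs, producing the
# "censored" message seen in the trace above.
- name: Enable and start NetworkManager
  ansible.builtin.systemd:
    name: NetworkManager
    state: started
    enabled: true
  no_log: true
```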
18911 1727096319.29188: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 18911 1727096319.29239: in run() - task 0afff68d-5257-09a7-aae1-000000000069 18911 1727096319.29251: variable 'ansible_search_path' from source: unknown 18911 1727096319.29254: variable 'ansible_search_path' from source: unknown 18911 1727096319.29286: calling self._execute() 18911 1727096319.29366: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096319.29372: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096319.29380: variable 'omit' from source: magic vars 18911 1727096319.29671: variable 'ansible_distribution_major_version' from source: facts 18911 1727096319.29681: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096319.29760: variable 'network_provider' from source: set_fact 18911 1727096319.29778: Evaluated conditional (network_provider == "nm"): True 18911 1727096319.30094: variable '__network_wpa_supplicant_required' from source: role '' defaults 18911 1727096319.30098: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 18911 1727096319.30325: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18911 1727096319.32484: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18911 1727096319.32560: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18911 1727096319.32604: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18911 1727096319.32641: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18911 1727096319.32675: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18911 1727096319.32976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096319.32981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096319.32984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096319.32986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096319.32989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096319.32991: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096319.32993: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096319.33029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096319.33084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096319.33088: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096319.33125: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096319.33150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096319.33183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096319.33227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096319.33231: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096319.33380: variable 'network_connections' from source: play vars 18911 1727096319.33393: variable 'profile' from source: play vars 18911 1727096319.33500: variable 'profile' from source: play vars 18911 1727096319.33506: variable 'interface' from source: set_fact 18911 1727096319.33566: variable 'interface' from source: set_fact 18911 1727096319.33675: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18911 1727096319.34004: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18911 1727096319.34038: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18911 1727096319.34066: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18911 1727096319.34101: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18911 1727096319.34195: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18911 1727096319.34198: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18911 1727096319.34201: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096319.34285: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18911 1727096319.34290: variable '__network_wireless_connections_defined' from source: role '' defaults 18911 1727096319.34674: variable 'network_connections' from source: play vars 18911 1727096319.34677: variable 'profile' from source: play vars 18911 1727096319.34680: variable 'profile' from source: play vars 18911 1727096319.34682: variable 'interface' from source: set_fact 18911 1727096319.34701: variable 'interface' from source: set_fact 18911 1727096319.34722: Evaluated conditional (__network_wpa_supplicant_required): False 18911 1727096319.34725: when evaluation is False, skipping this task 18911 1727096319.34728: _execute() done 18911 1727096319.34738: dumping result 
to json 18911 1727096319.34741: done dumping result, returning 18911 1727096319.34743: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0afff68d-5257-09a7-aae1-000000000069] 18911 1727096319.34746: sending task result for task 0afff68d-5257-09a7-aae1-000000000069 skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 18911 1727096319.35062: no more pending results, returning what we have 18911 1727096319.35068: results queue empty 18911 1727096319.35070: checking for any_errors_fatal 18911 1727096319.35089: done checking for any_errors_fatal 18911 1727096319.35090: checking for max_fail_percentage 18911 1727096319.35092: done checking for max_fail_percentage 18911 1727096319.35093: checking to see if all hosts have failed and the running result is not ok 18911 1727096319.35093: done checking to see if all hosts have failed 18911 1727096319.35094: getting the remaining hosts for this loop 18911 1727096319.35095: done getting the remaining hosts for this loop 18911 1727096319.35098: getting the next task for host managed_node1 18911 1727096319.35103: done getting next task for host managed_node1 18911 1727096319.35107: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 18911 1727096319.35109: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096319.35121: getting variables 18911 1727096319.35122: in VariableManager get_vars() 18911 1727096319.35158: Calling all_inventory to load vars for managed_node1 18911 1727096319.35160: Calling groups_inventory to load vars for managed_node1 18911 1727096319.35162: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096319.35280: Calling all_plugins_play to load vars for managed_node1 18911 1727096319.35284: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096319.35376: Calling groups_plugins_play to load vars for managed_node1 18911 1727096319.35983: done sending task result for task 0afff68d-5257-09a7-aae1-000000000069 18911 1727096319.35986: WORKER PROCESS EXITING 18911 1727096319.38033: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096319.40827: done with get_vars() 18911 1727096319.40866: done getting variables 18911 1727096319.40945: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Monday 23 September 2024 08:58:39 -0400 (0:00:00.123) 0:00:38.524 ****** 18911 1727096319.40992: entering _queue_task() for managed_node1/service 18911 1727096319.41407: worker is 1 (out of 1 available) 18911 1727096319.41422: exiting _queue_task() for managed_node1/service 18911 1727096319.41435: done queuing things up, now waiting for results queue to drain 18911 1727096319.41437: waiting for pending results... 
18911 1727096319.41746: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 18911 1727096319.41899: in run() - task 0afff68d-5257-09a7-aae1-00000000006a 18911 1727096319.41922: variable 'ansible_search_path' from source: unknown 18911 1727096319.41931: variable 'ansible_search_path' from source: unknown 18911 1727096319.41979: calling self._execute() 18911 1727096319.42091: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096319.42108: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096319.42135: variable 'omit' from source: magic vars 18911 1727096319.42601: variable 'ansible_distribution_major_version' from source: facts 18911 1727096319.42619: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096319.42778: variable 'network_provider' from source: set_fact 18911 1727096319.42790: Evaluated conditional (network_provider == "initscripts"): False 18911 1727096319.42797: when evaluation is False, skipping this task 18911 1727096319.42804: _execute() done 18911 1727096319.42812: dumping result to json 18911 1727096319.42876: done dumping result, returning 18911 1727096319.42879: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0afff68d-5257-09a7-aae1-00000000006a] 18911 1727096319.42882: sending task result for task 0afff68d-5257-09a7-aae1-00000000006a 18911 1727096319.42951: done sending task result for task 0afff68d-5257-09a7-aae1-00000000006a 18911 1727096319.42955: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18911 1727096319.43025: no more pending results, returning what we have 18911 1727096319.43028: results queue empty 18911 1727096319.43030: checking for any_errors_fatal 18911 1727096319.43041: done checking for 
any_errors_fatal 18911 1727096319.43042: checking for max_fail_percentage 18911 1727096319.43044: done checking for max_fail_percentage 18911 1727096319.43045: checking to see if all hosts have failed and the running result is not ok 18911 1727096319.43046: done checking to see if all hosts have failed 18911 1727096319.43046: getting the remaining hosts for this loop 18911 1727096319.43048: done getting the remaining hosts for this loop 18911 1727096319.43051: getting the next task for host managed_node1 18911 1727096319.43058: done getting next task for host managed_node1 18911 1727096319.43063: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 18911 1727096319.43070: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096319.43273: getting variables 18911 1727096319.43275: in VariableManager get_vars() 18911 1727096319.43315: Calling all_inventory to load vars for managed_node1 18911 1727096319.43318: Calling groups_inventory to load vars for managed_node1 18911 1727096319.43321: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096319.43333: Calling all_plugins_play to load vars for managed_node1 18911 1727096319.43336: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096319.43339: Calling groups_plugins_play to load vars for managed_node1 18911 1727096319.45785: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096319.46897: done with get_vars() 18911 1727096319.46926: done getting variables 18911 1727096319.46992: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Monday 23 September 2024 08:58:39 -0400 (0:00:00.060) 0:00:38.584 ****** 18911 1727096319.47025: entering _queue_task() for managed_node1/copy 18911 1727096319.47410: worker is 1 (out of 1 available) 18911 1727096319.47425: exiting _queue_task() for managed_node1/copy 18911 1727096319.47437: done queuing things up, now waiting for results queue to drain 18911 1727096319.47440: waiting for pending results... 
18911 1727096319.47790: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 18911 1727096319.47878: in run() - task 0afff68d-5257-09a7-aae1-00000000006b 18911 1727096319.47891: variable 'ansible_search_path' from source: unknown 18911 1727096319.47895: variable 'ansible_search_path' from source: unknown 18911 1727096319.47982: calling self._execute() 18911 1727096319.48052: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096319.48142: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096319.48146: variable 'omit' from source: magic vars 18911 1727096319.48574: variable 'ansible_distribution_major_version' from source: facts 18911 1727096319.48578: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096319.48657: variable 'network_provider' from source: set_fact 18911 1727096319.48661: Evaluated conditional (network_provider == "initscripts"): False 18911 1727096319.48666: when evaluation is False, skipping this task 18911 1727096319.48677: _execute() done 18911 1727096319.48681: dumping result to json 18911 1727096319.48683: done dumping result, returning 18911 1727096319.48687: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0afff68d-5257-09a7-aae1-00000000006b] 18911 1727096319.48689: sending task result for task 0afff68d-5257-09a7-aae1-00000000006b 18911 1727096319.48801: done sending task result for task 0afff68d-5257-09a7-aae1-00000000006b 18911 1727096319.48804: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 18911 1727096319.48876: no more pending results, returning what we have 18911 1727096319.48881: results queue empty 18911 1727096319.48882: checking for 
any_errors_fatal 18911 1727096319.48888: done checking for any_errors_fatal 18911 1727096319.48889: checking for max_fail_percentage 18911 1727096319.48891: done checking for max_fail_percentage 18911 1727096319.48892: checking to see if all hosts have failed and the running result is not ok 18911 1727096319.48893: done checking to see if all hosts have failed 18911 1727096319.48893: getting the remaining hosts for this loop 18911 1727096319.48895: done getting the remaining hosts for this loop 18911 1727096319.48899: getting the next task for host managed_node1 18911 1727096319.48906: done getting next task for host managed_node1 18911 1727096319.48911: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 18911 1727096319.48913: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096319.49055: getting variables 18911 1727096319.49058: in VariableManager get_vars() 18911 1727096319.49104: Calling all_inventory to load vars for managed_node1 18911 1727096319.49107: Calling groups_inventory to load vars for managed_node1 18911 1727096319.49109: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096319.49120: Calling all_plugins_play to load vars for managed_node1 18911 1727096319.49123: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096319.49125: Calling groups_plugins_play to load vars for managed_node1 18911 1727096319.50054: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096319.51673: done with get_vars() 18911 1727096319.51705: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Monday 23 September 2024 08:58:39 -0400 (0:00:00.047) 0:00:38.632 ****** 18911 1727096319.51807: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 18911 1727096319.52198: worker is 1 (out of 1 available) 18911 1727096319.52210: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 18911 1727096319.52222: done queuing things up, now waiting for results queue to drain 18911 1727096319.52224: waiting for pending results... 
18911 1727096319.52551: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 18911 1727096319.52734: in run() - task 0afff68d-5257-09a7-aae1-00000000006c 18911 1727096319.52738: variable 'ansible_search_path' from source: unknown 18911 1727096319.52740: variable 'ansible_search_path' from source: unknown 18911 1727096319.52743: calling self._execute() 18911 1727096319.52772: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096319.52777: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096319.52790: variable 'omit' from source: magic vars 18911 1727096319.53540: variable 'ansible_distribution_major_version' from source: facts 18911 1727096319.53551: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096319.53557: variable 'omit' from source: magic vars 18911 1727096319.53712: variable 'omit' from source: magic vars 18911 1727096319.53972: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18911 1727096319.57575: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18911 1727096319.57580: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18911 1727096319.57712: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18911 1727096319.57742: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18911 1727096319.57772: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18911 1727096319.57965: variable 'network_provider' from source: set_fact 18911 1727096319.58181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18911 1727096319.58209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18911 1727096319.58329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18911 1727096319.58402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18911 1727096319.58419: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18911 1727096319.58593: variable 'omit' from source: magic vars 18911 1727096319.58710: variable 'omit' from source: magic vars 18911 1727096319.58836: variable 'network_connections' from source: play vars 18911 1727096319.58846: variable 'profile' from source: play vars 18911 1727096319.58975: variable 'profile' from source: play vars 18911 1727096319.58979: variable 'interface' from source: set_fact 18911 1727096319.58981: variable 'interface' from source: set_fact 18911 1727096319.59270: variable 'omit' from source: magic vars 18911 1727096319.59274: variable '__lsr_ansible_managed' from source: task vars 18911 1727096319.59448: variable '__lsr_ansible_managed' from source: task vars 18911 1727096319.59885: Loaded config def from plugin (lookup/template) 18911 1727096319.59888: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 18911 1727096319.59890: File lookup term: get_ansible_managed.j2 18911 
1727096319.60179: variable 'ansible_search_path' from source: unknown 18911 1727096319.60182: evaluation_path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 18911 1727096319.60186: search_path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 18911 1727096319.60191: variable 'ansible_search_path' from source: unknown 18911 1727096319.75219: variable 'ansible_managed' from source: unknown 18911 1727096319.75356: variable 'omit' from source: magic vars 18911 1727096319.75383: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18911 1727096319.75413: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18911 1727096319.75428: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18911 1727096319.75443: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096319.75453: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096319.75474: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18911 1727096319.75478: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096319.75481: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096319.75578: Set connection var ansible_shell_executable to /bin/sh 18911 1727096319.75584: Set connection var ansible_timeout to 10 18911 1727096319.75586: Set connection var ansible_shell_type to sh 18911 1727096319.75672: Set connection var ansible_module_compression to ZIP_DEFLATED 18911 1727096319.75677: Set connection var ansible_pipelining to False 18911 1727096319.75679: Set connection var ansible_connection to ssh 18911 1727096319.75681: variable 'ansible_shell_executable' from source: unknown 18911 1727096319.75683: variable 'ansible_connection' from source: unknown 18911 1727096319.75685: variable 'ansible_module_compression' from source: unknown 18911 1727096319.75687: variable 'ansible_shell_type' from source: unknown 18911 1727096319.75689: variable 'ansible_shell_executable' from source: unknown 18911 1727096319.75691: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096319.75693: variable 'ansible_pipelining' from source: unknown 18911 1727096319.75695: variable 'ansible_timeout' from source: unknown 18911 1727096319.75697: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096319.75813: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18911 1727096319.75827: variable 'omit' from source: magic vars 18911 1727096319.75830: starting attempt loop 18911 1727096319.75832: running the handler 18911 1727096319.75834: _low_level_execute_command(): starting 18911 1727096319.75836: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18911 1727096319.76575: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096319.76622: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096319.76703: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096319.78443: stdout chunk (state=3): >>>/root <<< 18911 1727096319.78601: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096319.78605: stdout chunk (state=3): >>><<< 18911 1727096319.78607: stderr chunk 
(state=3): >>><<< 18911 1727096319.78626: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096319.78645: _low_level_execute_command(): starting 18911 1727096319.78657: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096319.7863255-20722-247423361590252 `" && echo ansible-tmp-1727096319.7863255-20722-247423361590252="` echo /root/.ansible/tmp/ansible-tmp-1727096319.7863255-20722-247423361590252 `" ) && sleep 0' 18911 1727096319.79304: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096319.79328: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096319.79345: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 18911 1727096319.79365: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096319.79392: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 18911 1727096319.79436: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096319.79510: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096319.79537: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096319.79559: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096319.79674: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096319.81616: stdout chunk (state=3): >>>ansible-tmp-1727096319.7863255-20722-247423361590252=/root/.ansible/tmp/ansible-tmp-1727096319.7863255-20722-247423361590252 <<< 18911 1727096319.81788: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096319.81792: stdout chunk (state=3): >>><<< 18911 1727096319.81794: stderr chunk (state=3): >>><<< 18911 1727096319.81810: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096319.7863255-20722-247423361590252=/root/.ansible/tmp/ansible-tmp-1727096319.7863255-20722-247423361590252 , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096319.81860: variable 'ansible_module_compression' from source: unknown 18911 1727096319.81980: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18911d7od04qi/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 18911 1727096319.81983: variable 'ansible_facts' from source: unknown 18911 1727096319.82124: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096319.7863255-20722-247423361590252/AnsiballZ_network_connections.py 18911 1727096319.82294: Sending initial data 18911 1727096319.82297: Sent initial data (168 bytes) 18911 1727096319.83011: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096319.83040: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096319.83140: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096319.84756: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18911 1727096319.84841: stderr chunk (state=3): >>>debug2: Sending 
SSH2_FXP_REALPATH "." <<< 18911 1727096319.84920: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18911d7od04qi/tmpja1h_o4v /root/.ansible/tmp/ansible-tmp-1727096319.7863255-20722-247423361590252/AnsiballZ_network_connections.py <<< 18911 1727096319.84923: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096319.7863255-20722-247423361590252/AnsiballZ_network_connections.py" <<< 18911 1727096319.84980: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18911d7od04qi/tmpja1h_o4v" to remote "/root/.ansible/tmp/ansible-tmp-1727096319.7863255-20722-247423361590252/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096319.7863255-20722-247423361590252/AnsiballZ_network_connections.py" <<< 18911 1727096319.85791: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096319.85830: stderr chunk (state=3): >>><<< 18911 1727096319.85834: stdout chunk (state=3): >>><<< 18911 1727096319.85853: done transferring module to remote 18911 1727096319.85861: _low_level_execute_command(): starting 18911 1727096319.85870: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096319.7863255-20722-247423361590252/ /root/.ansible/tmp/ansible-tmp-1727096319.7863255-20722-247423361590252/AnsiballZ_network_connections.py && sleep 0' 18911 1727096319.86549: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096319.86565: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096319.86602: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096319.86691: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096319.88572: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096319.88576: stderr chunk (state=3): >>><<< 18911 1727096319.88579: stdout chunk (state=3): >>><<< 18911 1727096319.88594: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096319.88598: _low_level_execute_command(): starting 18911 1727096319.88600: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096319.7863255-20722-247423361590252/AnsiballZ_network_connections.py && sleep 0' 18911 1727096319.89032: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096319.89036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096319.89040: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096319.89042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096319.89097: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096319.89103: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096319.89109: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096319.89174: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096320.16188: stdout chunk (state=3): >>>Traceback (most recent call last):
  File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_0jaipgqn/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back
  File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_0jaipgqn/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail
ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on lsr27/310bfb25-4f10-4235-b65a-8e010daf3c53: error=unknown <<< 18911 1727096320.16320: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 18911 1727096320.18275: stderr chunk (state=3): >>>debug2: Received exit status from master 0
Shared connection to 10.31.11.125 closed. 
<<< 18911 1727096320.18278: stdout chunk (state=3): >>><<< 18911 1727096320.18281: stderr chunk (state=3): >>><<< 18911 1727096320.18283: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last):
  File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_0jaipgqn/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back
  File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_0jaipgqn/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail
ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on lsr27/310bfb25-4f10-4235-b65a-8e010daf3c53: error=unknown
{"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}}
, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 18911 1727096320.18286: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'lsr27', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096319.7863255-20722-247423361590252/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18911 1727096320.18288: _low_level_execute_command(): starting 18911 1727096320.18291: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096319.7863255-20722-247423361590252/ > /dev/null 2>&1 && sleep 0' 18911 1727096320.19350: 
stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096320.19435: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096320.19539: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096320.19556: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096320.19604: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096320.19738: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096320.21826: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096320.21877: stderr chunk (state=3): >>><<< 18911 1727096320.21880: stdout chunk (state=3): >>><<< 18911 1727096320.21988: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 
originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096320.21991: handler run complete 18911 1727096320.22035: attempt loop complete, returning result 18911 1727096320.22378: _execute() done 18911 1727096320.22382: dumping result to json 18911 1727096320.22384: done dumping result, returning 18911 1727096320.22386: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0afff68d-5257-09a7-aae1-00000000006c] 18911 1727096320.22388: sending task result for task 0afff68d-5257-09a7-aae1-00000000006c 18911 1727096320.22594: done sending task result for task 0afff68d-5257-09a7-aae1-00000000006c
changed: [managed_node1] => {
    "_invocation": {
        "module_args": {
            "__debug_flags": "",
            "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
            "connections": [
                {
                    "name": "lsr27",
                    "persistent_state": "absent"
                }
            ],
            "force_state_change": false,
            "ignore_errors": false,
            "provider": "nm"
        }
    },
    "changed": true
}

STDERR:

18911 1727096320.22693: no more pending results, returning what we have 18911 1727096320.22696: results queue empty 18911 1727096320.22697: checking for 
any_errors_fatal 18911 1727096320.22704: done checking for any_errors_fatal 18911 1727096320.22705: checking for max_fail_percentage 18911 1727096320.22706: done checking for max_fail_percentage 18911 1727096320.22707: checking to see if all hosts have failed and the running result is not ok 18911 1727096320.22708: done checking to see if all hosts have failed 18911 1727096320.22708: getting the remaining hosts for this loop 18911 1727096320.22710: done getting the remaining hosts for this loop 18911 1727096320.22713: getting the next task for host managed_node1 18911 1727096320.22719: done getting next task for host managed_node1 18911 1727096320.22722: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 18911 1727096320.22724: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096320.22732: getting variables 18911 1727096320.22734: in VariableManager get_vars() 18911 1727096320.22902: Calling all_inventory to load vars for managed_node1 18911 1727096320.22905: Calling groups_inventory to load vars for managed_node1 18911 1727096320.22908: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096320.22920: Calling all_plugins_play to load vars for managed_node1 18911 1727096320.22923: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096320.22927: Calling groups_plugins_play to load vars for managed_node1 18911 1727096320.23452: WORKER PROCESS EXITING 18911 1727096320.24666: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096320.26226: done with get_vars() 18911 1727096320.26258: done getting variables
TASK [fedora.linux_system_roles.network : Configure networking state] **********
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171
Monday 23 September 2024  08:58:40 -0400 (0:00:00.745)       0:00:39.377 ******
18911 1727096320.26346: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 18911 1727096320.26971: worker is 1 (out of 1 available) 18911 1727096320.26979: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 18911 1727096320.26990: done queuing things up, now waiting for results queue to drain 18911 1727096320.26991: waiting for pending results... 
18911 1727096320.27119: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 18911 1727096320.27213: in run() - task 0afff68d-5257-09a7-aae1-00000000006d 18911 1727096320.27238: variable 'ansible_search_path' from source: unknown 18911 1727096320.27247: variable 'ansible_search_path' from source: unknown 18911 1727096320.27301: calling self._execute() 18911 1727096320.27403: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096320.27429: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096320.27512: variable 'omit' from source: magic vars 18911 1727096320.27928: variable 'ansible_distribution_major_version' from source: facts 18911 1727096320.27958: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096320.28175: variable 'network_state' from source: role '' defaults 18911 1727096320.28178: Evaluated conditional (network_state != {}): False 18911 1727096320.28181: when evaluation is False, skipping this task 18911 1727096320.28183: _execute() done 18911 1727096320.28186: dumping result to json 18911 1727096320.28283: done dumping result, returning 18911 1727096320.28287: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0afff68d-5257-09a7-aae1-00000000006d] 18911 1727096320.28289: sending task result for task 0afff68d-5257-09a7-aae1-00000000006d 18911 1727096320.28349: done sending task result for task 0afff68d-5257-09a7-aae1-00000000006d 18911 1727096320.28352: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
18911 1727096320.28433: no more pending results, returning what we have 18911 1727096320.28436: results queue empty 18911 1727096320.28437: checking for any_errors_fatal 18911 1727096320.28450: done checking for any_errors_fatal 
18911 1727096320.28451: checking for max_fail_percentage 18911 1727096320.28452: done checking for max_fail_percentage 18911 1727096320.28453: checking to see if all hosts have failed and the running result is not ok 18911 1727096320.28453: done checking to see if all hosts have failed 18911 1727096320.28454: getting the remaining hosts for this loop 18911 1727096320.28456: done getting the remaining hosts for this loop 18911 1727096320.28459: getting the next task for host managed_node1 18911 1727096320.28466: done getting next task for host managed_node1 18911 1727096320.28471: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 18911 1727096320.28474: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096320.28489: getting variables 18911 1727096320.28491: in VariableManager get_vars() 18911 1727096320.28526: Calling all_inventory to load vars for managed_node1 18911 1727096320.28529: Calling groups_inventory to load vars for managed_node1 18911 1727096320.28531: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096320.28543: Calling all_plugins_play to load vars for managed_node1 18911 1727096320.28545: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096320.28548: Calling groups_plugins_play to load vars for managed_node1 18911 1727096320.30970: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096320.33665: done with get_vars() 18911 1727096320.33700: done getting variables 18911 1727096320.33763: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] ***
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177
Monday 23 September 2024  08:58:40 -0400 (0:00:00.074)       0:00:39.452 ******
18911 1727096320.33797: entering _queue_task() for managed_node1/debug 18911 1727096320.34143: worker is 1 (out of 1 available) 18911 1727096320.34157: exiting _queue_task() for managed_node1/debug 18911 1727096320.34171: done queuing things up, now waiting for results queue to drain 18911 1727096320.34173: waiting for pending results... 
18911 1727096320.34587: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 18911 1727096320.34593: in run() - task 0afff68d-5257-09a7-aae1-00000000006e 18911 1727096320.34598: variable 'ansible_search_path' from source: unknown 18911 1727096320.34602: variable 'ansible_search_path' from source: unknown 18911 1727096320.34641: calling self._execute() 18911 1727096320.34750: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096320.34753: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096320.34765: variable 'omit' from source: magic vars 18911 1727096320.35406: variable 'ansible_distribution_major_version' from source: facts 18911 1727096320.35422: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096320.35432: variable 'omit' from source: magic vars 18911 1727096320.35479: variable 'omit' from source: magic vars 18911 1727096320.35574: variable 'omit' from source: magic vars 18911 1727096320.35577: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18911 1727096320.35607: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18911 1727096320.35632: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18911 1727096320.35653: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096320.35667: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096320.35705: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18911 1727096320.35714: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096320.35721: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 18911 1727096320.35829: Set connection var ansible_shell_executable to /bin/sh 18911 1727096320.35840: Set connection var ansible_timeout to 10 18911 1727096320.35847: Set connection var ansible_shell_type to sh 18911 1727096320.35897: Set connection var ansible_module_compression to ZIP_DEFLATED 18911 1727096320.35900: Set connection var ansible_pipelining to False 18911 1727096320.35903: Set connection var ansible_connection to ssh 18911 1727096320.35904: variable 'ansible_shell_executable' from source: unknown 18911 1727096320.35906: variable 'ansible_connection' from source: unknown 18911 1727096320.35912: variable 'ansible_module_compression' from source: unknown 18911 1727096320.35919: variable 'ansible_shell_type' from source: unknown 18911 1727096320.35925: variable 'ansible_shell_executable' from source: unknown 18911 1727096320.35931: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096320.35939: variable 'ansible_pipelining' from source: unknown 18911 1727096320.35945: variable 'ansible_timeout' from source: unknown 18911 1727096320.35952: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096320.36115: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18911 1727096320.36118: variable 'omit' from source: magic vars 18911 1727096320.36121: starting attempt loop 18911 1727096320.36123: running the handler 18911 1727096320.36252: variable '__network_connections_result' from source: set_fact 18911 1727096320.36309: handler run complete 18911 1727096320.36438: attempt loop complete, returning result 18911 1727096320.36442: _execute() done 18911 1727096320.36445: dumping result to json 18911 1727096320.36447: 
done dumping result, returning 18911 1727096320.36450: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0afff68d-5257-09a7-aae1-00000000006e] 18911 1727096320.36452: sending task result for task 0afff68d-5257-09a7-aae1-00000000006e 18911 1727096320.36524: done sending task result for task 0afff68d-5257-09a7-aae1-00000000006e 18911 1727096320.36528: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "__network_connections_result.stderr_lines": [
        ""
    ]
}
18911 1727096320.36594: no more pending results, returning what we have 18911 1727096320.36598: results queue empty 18911 1727096320.36599: checking for any_errors_fatal 18911 1727096320.36605: done checking for any_errors_fatal 18911 1727096320.36606: checking for max_fail_percentage 18911 1727096320.36608: done checking for max_fail_percentage 18911 1727096320.36608: checking to see if all hosts have failed and the running result is not ok 18911 1727096320.36609: done checking to see if all hosts have failed 18911 1727096320.36610: getting the remaining hosts for this loop 18911 1727096320.36611: done getting the remaining hosts for this loop 18911 1727096320.36615: getting the next task for host managed_node1 18911 1727096320.36622: done getting next task for host managed_node1 18911 1727096320.36626: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 18911 1727096320.36628: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096320.36637: getting variables 18911 1727096320.36640: in VariableManager get_vars() 18911 1727096320.36678: Calling all_inventory to load vars for managed_node1 18911 1727096320.36681: Calling groups_inventory to load vars for managed_node1 18911 1727096320.36684: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096320.36695: Calling all_plugins_play to load vars for managed_node1 18911 1727096320.36698: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096320.36701: Calling groups_plugins_play to load vars for managed_node1 18911 1727096320.45306: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096320.47405: done with get_vars() 18911 1727096320.47435: done getting variables 18911 1727096320.47512: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] ***
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181
Monday 23 September 2024  08:58:40 -0400 (0:00:00.137)       0:00:39.589 ******
18911 1727096320.47540: entering _queue_task() for managed_node1/debug 18911 1727096320.47902: worker is 1 (out of 1 available) 18911 1727096320.48076: exiting _queue_task() for managed_node1/debug 18911 1727096320.48092: done queuing things up, now waiting for results queue to drain 18911 1727096320.48095: waiting for pending results... 
18911 1727096320.48386: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 18911 1727096320.48413: in run() - task 0afff68d-5257-09a7-aae1-00000000006f 18911 1727096320.48434: variable 'ansible_search_path' from source: unknown 18911 1727096320.48503: variable 'ansible_search_path' from source: unknown 18911 1727096320.48506: calling self._execute() 18911 1727096320.48615: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096320.48630: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096320.48647: variable 'omit' from source: magic vars 18911 1727096320.49073: variable 'ansible_distribution_major_version' from source: facts 18911 1727096320.49091: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096320.49102: variable 'omit' from source: magic vars 18911 1727096320.49174: variable 'omit' from source: magic vars 18911 1727096320.49219: variable 'omit' from source: magic vars 18911 1727096320.49289: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18911 1727096320.49375: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18911 1727096320.49380: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18911 1727096320.49387: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096320.49403: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096320.49442: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18911 1727096320.49452: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096320.49461: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 18911 1727096320.49577: Set connection var ansible_shell_executable to /bin/sh 18911 1727096320.49772: Set connection var ansible_timeout to 10 18911 1727096320.49776: Set connection var ansible_shell_type to sh 18911 1727096320.49778: Set connection var ansible_module_compression to ZIP_DEFLATED 18911 1727096320.49780: Set connection var ansible_pipelining to False 18911 1727096320.49782: Set connection var ansible_connection to ssh 18911 1727096320.49784: variable 'ansible_shell_executable' from source: unknown 18911 1727096320.49786: variable 'ansible_connection' from source: unknown 18911 1727096320.49789: variable 'ansible_module_compression' from source: unknown 18911 1727096320.49791: variable 'ansible_shell_type' from source: unknown 18911 1727096320.49793: variable 'ansible_shell_executable' from source: unknown 18911 1727096320.49795: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096320.49797: variable 'ansible_pipelining' from source: unknown 18911 1727096320.49799: variable 'ansible_timeout' from source: unknown 18911 1727096320.49801: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096320.49866: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18911 1727096320.49886: variable 'omit' from source: magic vars 18911 1727096320.49897: starting attempt loop 18911 1727096320.49904: running the handler 18911 1727096320.49960: variable '__network_connections_result' from source: set_fact 18911 1727096320.50049: variable '__network_connections_result' from source: set_fact 18911 1727096320.50175: handler run complete 18911 1727096320.50205: attempt loop complete, returning result 18911 1727096320.50213: 
_execute() done 18911 1727096320.50222: dumping result to json 18911 1727096320.50231: done dumping result, returning 18911 1727096320.50277: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0afff68d-5257-09a7-aae1-00000000006f] 18911 1727096320.50280: sending task result for task 0afff68d-5257-09a7-aae1-00000000006f
ok: [managed_node1] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "name": "lsr27",
                        "persistent_state": "absent"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "\n",
        "stderr_lines": [
            ""
        ]
    }
}
18911 1727096320.50596: no more pending results, returning what we have 18911 1727096320.50600: results queue empty 18911 1727096320.50601: checking for any_errors_fatal 18911 1727096320.50611: done checking for any_errors_fatal 18911 1727096320.50612: checking for max_fail_percentage 18911 1727096320.50614: done checking for max_fail_percentage 18911 1727096320.50615: checking to see if all hosts have failed and the running result is not ok 18911 1727096320.50615: done checking to see if all hosts have failed 18911 1727096320.50616: getting the remaining hosts for this loop 18911 1727096320.50618: done getting the remaining hosts for this loop 18911 1727096320.50621: getting the next task for host managed_node1 18911 1727096320.50629: done getting next task for host managed_node1 18911 1727096320.50632: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 18911 1727096320.50634: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state?
(None), did rescue? False, did start at task? False 18911 1727096320.50646: getting variables 18911 1727096320.50648: in VariableManager get_vars() 18911 1727096320.51125: Calling all_inventory to load vars for managed_node1 18911 1727096320.51128: Calling groups_inventory to load vars for managed_node1 18911 1727096320.51131: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096320.51141: Calling all_plugins_play to load vars for managed_node1 18911 1727096320.51143: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096320.51146: Calling groups_plugins_play to load vars for managed_node1 18911 1727096320.51882: done sending task result for task 0afff68d-5257-09a7-aae1-00000000006f 18911 1727096320.51885: WORKER PROCESS EXITING 18911 1727096320.53462: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096320.55235: done with get_vars() 18911 1727096320.55258: done getting variables 18911 1727096320.55321: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Monday 23 September 2024 08:58:40 -0400 (0:00:00.078) 0:00:39.667 ****** 18911 1727096320.55357: entering _queue_task() for managed_node1/debug 18911 1727096320.55702: worker is 1 (out of 1 available) 18911 1727096320.55716: exiting _queue_task() for managed_node1/debug 18911 1727096320.55727: done queuing things up, now waiting for results queue to drain 18911 1727096320.55728: waiting for pending results... 
18911 1727096320.56017: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 18911 1727096320.56144: in run() - task 0afff68d-5257-09a7-aae1-000000000070 18911 1727096320.56162: variable 'ansible_search_path' from source: unknown 18911 1727096320.56175: variable 'ansible_search_path' from source: unknown 18911 1727096320.56219: calling self._execute() 18911 1727096320.56329: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096320.56341: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096320.56356: variable 'omit' from source: magic vars 18911 1727096320.56751: variable 'ansible_distribution_major_version' from source: facts 18911 1727096320.56773: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096320.56909: variable 'network_state' from source: role '' defaults 18911 1727096320.56925: Evaluated conditional (network_state != {}): False 18911 1727096320.56931: when evaluation is False, skipping this task 18911 1727096320.56939: _execute() done 18911 1727096320.56949: dumping result to json 18911 1727096320.56957: done dumping result, returning 18911 1727096320.57176: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0afff68d-5257-09a7-aae1-000000000070] 18911 1727096320.57179: sending task result for task 0afff68d-5257-09a7-aae1-000000000070 18911 1727096320.57249: done sending task result for task 0afff68d-5257-09a7-aae1-000000000070 18911 1727096320.57251: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "false_condition": "network_state != {}"
}
18911 1727096320.57301: no more pending results, returning what we have 18911 1727096320.57305: results queue empty 18911 1727096320.57305: checking for any_errors_fatal 18911 1727096320.57315: done checking for any_errors_fatal 18911 1727096320.57316: checking for
max_fail_percentage 18911 1727096320.57318: done checking for max_fail_percentage 18911 1727096320.57319: checking to see if all hosts have failed and the running result is not ok 18911 1727096320.57320: done checking to see if all hosts have failed 18911 1727096320.57321: getting the remaining hosts for this loop 18911 1727096320.57322: done getting the remaining hosts for this loop 18911 1727096320.57326: getting the next task for host managed_node1 18911 1727096320.57331: done getting next task for host managed_node1 18911 1727096320.57335: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 18911 1727096320.57338: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18911 1727096320.57353: getting variables 18911 1727096320.57354: in VariableManager get_vars() 18911 1727096320.57395: Calling all_inventory to load vars for managed_node1 18911 1727096320.57398: Calling groups_inventory to load vars for managed_node1 18911 1727096320.57400: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096320.57411: Calling all_plugins_play to load vars for managed_node1 18911 1727096320.57415: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096320.57417: Calling groups_plugins_play to load vars for managed_node1 18911 1727096320.59517: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096320.62860: done with get_vars() 18911 1727096320.63100: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Monday 23 September 2024 08:58:40 -0400 
(0:00:00.078) 0:00:39.746 ****** 18911 1727096320.63201: entering _queue_task() for managed_node1/ping 18911 1727096320.63941: worker is 1 (out of 1 available) 18911 1727096320.64071: exiting _queue_task() for managed_node1/ping 18911 1727096320.64084: done queuing things up, now waiting for results queue to drain 18911 1727096320.64086: waiting for pending results... 18911 1727096320.64451: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 18911 1727096320.65073: in run() - task 0afff68d-5257-09a7-aae1-000000000071 18911 1727096320.65078: variable 'ansible_search_path' from source: unknown 18911 1727096320.65081: variable 'ansible_search_path' from source: unknown 18911 1727096320.65083: calling self._execute() 18911 1727096320.65086: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096320.65088: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096320.65091: variable 'omit' from source: magic vars 18911 1727096320.65825: variable 'ansible_distribution_major_version' from source: facts 18911 1727096320.65844: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096320.65855: variable 'omit' from source: magic vars 18911 1727096320.65915: variable 'omit' from source: magic vars 18911 1727096320.66010: variable 'omit' from source: magic vars 18911 1727096320.66273: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18911 1727096320.66277: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18911 1727096320.66291: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18911 1727096320.66315: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096320.66333: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096320.66405: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18911 1727096320.66672: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096320.66676: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096320.66692: Set connection var ansible_shell_executable to /bin/sh 18911 1727096320.66702: Set connection var ansible_timeout to 10 18911 1727096320.66709: Set connection var ansible_shell_type to sh 18911 1727096320.66719: Set connection var ansible_module_compression to ZIP_DEFLATED 18911 1727096320.66731: Set connection var ansible_pipelining to False 18911 1727096320.66740: Set connection var ansible_connection to ssh 18911 1727096320.66772: variable 'ansible_shell_executable' from source: unknown 18911 1727096320.67072: variable 'ansible_connection' from source: unknown 18911 1727096320.67076: variable 'ansible_module_compression' from source: unknown 18911 1727096320.67078: variable 'ansible_shell_type' from source: unknown 18911 1727096320.67081: variable 'ansible_shell_executable' from source: unknown 18911 1727096320.67083: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096320.67085: variable 'ansible_pipelining' from source: unknown 18911 1727096320.67087: variable 'ansible_timeout' from source: unknown 18911 1727096320.67089: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096320.67220: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18911 1727096320.67572: variable 'omit' from source: magic vars 18911 1727096320.67575: starting attempt loop 18911 1727096320.67578: running 
the handler 18911 1727096320.67580: _low_level_execute_command(): starting 18911 1727096320.67583: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18911 1727096320.68989: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096320.69004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096320.69166: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096320.69181: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096320.69210: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096320.69310: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096320.71028: stdout chunk (state=3): >>>/root <<< 18911 1727096320.71290: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096320.71337: stderr chunk (state=3): >>><<< 18911 1727096320.71347: stdout chunk (state=3): >>><<< 18911 1727096320.71489: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096320.71509: _low_level_execute_command(): starting 18911 1727096320.71520: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096320.7149637-20765-152158712426250 `" && echo ansible-tmp-1727096320.7149637-20765-152158712426250="` echo /root/.ansible/tmp/ansible-tmp-1727096320.7149637-20765-152158712426250 `" ) && sleep 0' 18911 1727096320.72615: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096320.72901: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 
10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096320.72926: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096320.72979: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096320.74958: stdout chunk (state=3): >>>ansible-tmp-1727096320.7149637-20765-152158712426250=/root/.ansible/tmp/ansible-tmp-1727096320.7149637-20765-152158712426250 <<< 18911 1727096320.75274: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096320.75278: stdout chunk (state=3): >>><<< 18911 1727096320.75281: stderr chunk (state=3): >>><<< 18911 1727096320.75285: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096320.7149637-20765-152158712426250=/root/.ansible/tmp/ansible-tmp-1727096320.7149637-20765-152158712426250 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096320.75304: variable 'ansible_module_compression' from source: unknown 18911 1727096320.75348: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18911d7od04qi/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 18911 1727096320.75554: variable 'ansible_facts' from source: unknown 18911 1727096320.75772: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096320.7149637-20765-152158712426250/AnsiballZ_ping.py 18911 1727096320.76011: Sending initial data 18911 1727096320.76021: Sent initial data (153 bytes) 18911 1727096320.77082: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096320.77110: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096320.77113: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration <<< 18911 
1727096320.77116: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096320.77118: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096320.77293: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096320.77306: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096320.77315: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096320.77414: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096320.79060: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 18911 1727096320.79065: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18911 1727096320.79145: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18911 1727096320.79316: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18911d7od04qi/tmpja8wdype /root/.ansible/tmp/ansible-tmp-1727096320.7149637-20765-152158712426250/AnsiballZ_ping.py <<< 18911 1727096320.79323: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096320.7149637-20765-152158712426250/AnsiballZ_ping.py" <<< 18911 1727096320.79347: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18911d7od04qi/tmpja8wdype" to remote "/root/.ansible/tmp/ansible-tmp-1727096320.7149637-20765-152158712426250/AnsiballZ_ping.py" <<< 18911 1727096320.79502: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096320.7149637-20765-152158712426250/AnsiballZ_ping.py" <<< 18911 1727096320.80624: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096320.80730: stderr chunk (state=3): >>><<< 18911 1727096320.80782: stdout chunk (state=3): >>><<< 18911 1727096320.80807: done transferring module to remote 18911 1727096320.80817: _low_level_execute_command(): starting 18911 1727096320.80850: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096320.7149637-20765-152158712426250/ /root/.ansible/tmp/ansible-tmp-1727096320.7149637-20765-152158712426250/AnsiballZ_ping.py && sleep 0' 18911 1727096320.82102: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096320.82111: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096320.82122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096320.82148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096320.82151: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.11.125 originally 10.31.11.125 <<< 18911 1727096320.82154: stderr chunk (state=3): >>>debug2: match not found <<< 18911 1727096320.82257: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096320.82326: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096320.82485: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096320.82583: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096320.84507: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096320.84511: stdout chunk (state=3): >>><<< 18911 1727096320.84518: stderr chunk (state=3): >>><<< 18911 1727096320.84587: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096320.84591: _low_level_execute_command(): starting 18911 1727096320.84596: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096320.7149637-20765-152158712426250/AnsiballZ_ping.py && sleep 0' 18911 1727096320.86024: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096320.86028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096320.86352: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 18911 1727096320.86356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096320.86359: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096320.86382: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096320.86385: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096320.86510: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096321.01897: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 18911 1727096321.03448: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. <<< 18911 1727096321.03481: stderr chunk (state=3): >>><<< 18911 1727096321.03488: stdout chunk (state=3): >>><<< 18911 1727096321.03598: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 18911 1727096321.03617: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096320.7149637-20765-152158712426250/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18911 1727096321.03625: _low_level_execute_command(): starting 18911 1727096321.03630: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096320.7149637-20765-152158712426250/ > /dev/null 2>&1 && sleep 0' 18911 1727096321.04504: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096321.04508: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096321.04511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096321.04513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096321.04515: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 18911 1727096321.04518: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096321.04520: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096321.04534: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096321.04574: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096321.04830: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096321.06837: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096321.06842: stdout chunk (state=3): >>><<< 18911 1727096321.06845: stderr chunk (state=3): >>><<< 18911 1727096321.06852: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096321.06855: handler run complete 18911 1727096321.06858: attempt loop complete, returning result 18911 1727096321.06860: _execute() done 18911 1727096321.06862: dumping result to json 18911 1727096321.06864: done dumping result, returning 18911 1727096321.07208: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0afff68d-5257-09a7-aae1-000000000071] 18911 1727096321.07211: sending task result for task 0afff68d-5257-09a7-aae1-000000000071 18911 1727096321.07275: done sending task result for task 0afff68d-5257-09a7-aae1-000000000071 18911 1727096321.07279: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "ping": "pong" } 18911 1727096321.07365: no more pending results, returning what we have 18911 1727096321.07372: results queue empty 18911 1727096321.07373: checking for any_errors_fatal 18911 1727096321.07382: done checking for any_errors_fatal 18911 1727096321.07382: checking for max_fail_percentage 18911 1727096321.07384: done checking for max_fail_percentage 18911 1727096321.07385: checking to see if all hosts have failed and the running result is not ok 18911 1727096321.07386: done checking to see if all hosts have failed 18911 1727096321.07387: getting the remaining hosts for this loop 18911 1727096321.07388: done getting the remaining hosts for this loop 18911 1727096321.07391: getting the next task for host managed_node1 18911 1727096321.07399: done getting next task for host managed_node1 18911 1727096321.07401: ^ task is: TASK: meta (role_complete) 18911 1727096321.07403: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18911 1727096321.07414: getting variables 18911 1727096321.07435: in VariableManager get_vars() 18911 1727096321.07551: Calling all_inventory to load vars for managed_node1 18911 1727096321.07554: Calling groups_inventory to load vars for managed_node1 18911 1727096321.07556: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096321.07565: Calling all_plugins_play to load vars for managed_node1 18911 1727096321.07570: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096321.07573: Calling groups_plugins_play to load vars for managed_node1 18911 1727096321.10232: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096321.11929: done with get_vars() 18911 1727096321.11956: done getting variables 18911 1727096321.12039: done queuing things up, now waiting for results queue to drain 18911 1727096321.12041: results queue empty 18911 1727096321.12042: checking for any_errors_fatal 18911 1727096321.12045: done checking for any_errors_fatal 18911 1727096321.12045: checking for max_fail_percentage 18911 1727096321.12046: done checking for max_fail_percentage 18911 1727096321.12047: checking to see if all hosts have failed and the running result is not ok 18911 1727096321.12048: done checking to see if all hosts have failed 18911 1727096321.12048: getting the remaining hosts for this loop 18911 1727096321.12049: done getting the remaining hosts for this loop 18911 1727096321.12052: getting the next task for host managed_node1 18911 1727096321.12055: done getting next task for host managed_node1 18911 1727096321.12057: ^ task is: TASK: meta (flush_handlers) 18911 1727096321.12058: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18911 1727096321.12061: getting variables 18911 1727096321.12062: in VariableManager get_vars() 18911 1727096321.12075: Calling all_inventory to load vars for managed_node1 18911 1727096321.12078: Calling groups_inventory to load vars for managed_node1 18911 1727096321.12079: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096321.12084: Calling all_plugins_play to load vars for managed_node1 18911 1727096321.12086: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096321.12089: Calling groups_plugins_play to load vars for managed_node1 18911 1727096321.13393: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096321.15206: done with get_vars() 18911 1727096321.15235: done getting variables 18911 1727096321.15324: in VariableManager get_vars() 18911 1727096321.15338: Calling all_inventory to load vars for managed_node1 18911 1727096321.15341: Calling groups_inventory to load vars for managed_node1 18911 1727096321.15343: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096321.15348: Calling all_plugins_play to load vars for managed_node1 18911 1727096321.15351: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096321.15354: Calling groups_plugins_play to load vars for managed_node1 18911 1727096321.16810: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096321.18601: done with get_vars() 18911 1727096321.18633: done queuing things up, now waiting for results queue to drain 18911 1727096321.18636: results queue empty 18911 1727096321.18636: checking for any_errors_fatal 18911 1727096321.18638: done checking for any_errors_fatal 18911 
1727096321.18639: checking for max_fail_percentage 18911 1727096321.18641: done checking for max_fail_percentage 18911 1727096321.18641: checking to see if all hosts have failed and the running result is not ok 18911 1727096321.18642: done checking to see if all hosts have failed 18911 1727096321.18643: getting the remaining hosts for this loop 18911 1727096321.18644: done getting the remaining hosts for this loop 18911 1727096321.18647: getting the next task for host managed_node1 18911 1727096321.18651: done getting next task for host managed_node1 18911 1727096321.18652: ^ task is: TASK: meta (flush_handlers) 18911 1727096321.18654: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18911 1727096321.18656: getting variables 18911 1727096321.18657: in VariableManager get_vars() 18911 1727096321.18671: Calling all_inventory to load vars for managed_node1 18911 1727096321.18673: Calling groups_inventory to load vars for managed_node1 18911 1727096321.18687: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096321.18694: Calling all_plugins_play to load vars for managed_node1 18911 1727096321.18696: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096321.18699: Calling groups_plugins_play to load vars for managed_node1 18911 1727096321.19996: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096321.21662: done with get_vars() 18911 1727096321.21688: done getting variables 18911 1727096321.21737: in VariableManager get_vars() 18911 1727096321.21749: Calling all_inventory to load vars for managed_node1 18911 1727096321.21751: Calling groups_inventory to load vars for managed_node1 18911 1727096321.21753: Calling 
all_plugins_inventory to load vars for managed_node1 18911 1727096321.21758: Calling all_plugins_play to load vars for managed_node1 18911 1727096321.21760: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096321.21763: Calling groups_plugins_play to load vars for managed_node1 18911 1727096321.22916: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096321.24483: done with get_vars() 18911 1727096321.24518: done queuing things up, now waiting for results queue to drain 18911 1727096321.24520: results queue empty 18911 1727096321.24521: checking for any_errors_fatal 18911 1727096321.24522: done checking for any_errors_fatal 18911 1727096321.24523: checking for max_fail_percentage 18911 1727096321.24524: done checking for max_fail_percentage 18911 1727096321.24525: checking to see if all hosts have failed and the running result is not ok 18911 1727096321.24526: done checking to see if all hosts have failed 18911 1727096321.24526: getting the remaining hosts for this loop 18911 1727096321.24527: done getting the remaining hosts for this loop 18911 1727096321.24530: getting the next task for host managed_node1 18911 1727096321.24533: done getting next task for host managed_node1 18911 1727096321.24534: ^ task is: None 18911 1727096321.24536: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096321.24537: done queuing things up, now waiting for results queue to drain 18911 1727096321.24538: results queue empty 18911 1727096321.24539: checking for any_errors_fatal 18911 1727096321.24539: done checking for any_errors_fatal 18911 1727096321.24540: checking for max_fail_percentage 18911 1727096321.24541: done checking for max_fail_percentage 18911 1727096321.24542: checking to see if all hosts have failed and the running result is not ok 18911 1727096321.24542: done checking to see if all hosts have failed 18911 1727096321.24544: getting the next task for host managed_node1 18911 1727096321.24546: done getting next task for host managed_node1 18911 1727096321.24547: ^ task is: None 18911 1727096321.24548: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096321.24607: in VariableManager get_vars() 18911 1727096321.24623: done with get_vars() 18911 1727096321.24629: in VariableManager get_vars() 18911 1727096321.24639: done with get_vars() 18911 1727096321.24644: variable 'omit' from source: magic vars 18911 1727096321.24687: in VariableManager get_vars() 18911 1727096321.24698: done with get_vars() 18911 1727096321.24735: variable 'omit' from source: magic vars PLAY [Assert device and profile are absent] ************************************ 18911 1727096321.24959: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18911 1727096321.24990: getting the remaining hosts for this loop 18911 1727096321.24992: done getting the remaining hosts for this loop 18911 1727096321.24994: getting the next task for host managed_node1 18911 1727096321.24996: done getting next task for host managed_node1 18911 1727096321.24998: ^ task is: TASK: Gathering Facts 18911 1727096321.24999: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096321.25001: getting variables 18911 1727096321.25002: in VariableManager get_vars() 18911 1727096321.25011: Calling all_inventory to load vars for managed_node1 18911 1727096321.25013: Calling groups_inventory to load vars for managed_node1 18911 1727096321.25015: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096321.25020: Calling all_plugins_play to load vars for managed_node1 18911 1727096321.25023: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096321.25026: Calling groups_plugins_play to load vars for managed_node1 18911 1727096321.26688: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096321.28305: done with get_vars() 18911 1727096321.28330: done getting variables 18911 1727096321.28379: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:68 Monday 23 September 2024 08:58:41 -0400 (0:00:00.652) 0:00:40.398 ****** 18911 1727096321.28404: entering _queue_task() for managed_node1/gather_facts 18911 1727096321.28746: worker is 1 (out of 1 available) 18911 1727096321.28758: exiting _queue_task() for managed_node1/gather_facts 18911 1727096321.28773: done queuing things up, now waiting for results queue to drain 18911 1727096321.28775: waiting for pending results... 
18911 1727096321.29182: running TaskExecutor() for managed_node1/TASK: Gathering Facts 18911 1727096321.29187: in run() - task 0afff68d-5257-09a7-aae1-0000000004e4 18911 1727096321.29191: variable 'ansible_search_path' from source: unknown 18911 1727096321.29194: calling self._execute() 18911 1727096321.29255: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096321.29268: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096321.29287: variable 'omit' from source: magic vars 18911 1727096321.29722: variable 'ansible_distribution_major_version' from source: facts 18911 1727096321.29745: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096321.29757: variable 'omit' from source: magic vars 18911 1727096321.29796: variable 'omit' from source: magic vars 18911 1727096321.29838: variable 'omit' from source: magic vars 18911 1727096321.29964: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18911 1727096321.29970: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18911 1727096321.29972: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18911 1727096321.30007: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096321.30039: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096321.30095: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18911 1727096321.30104: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096321.30112: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096321.30253: Set connection var ansible_shell_executable to /bin/sh 18911 1727096321.30265: Set 
connection var ansible_timeout to 10 18911 1727096321.30275: Set connection var ansible_shell_type to sh 18911 1727096321.30292: Set connection var ansible_module_compression to ZIP_DEFLATED 18911 1727096321.30308: Set connection var ansible_pipelining to False 18911 1727096321.30318: Set connection var ansible_connection to ssh 18911 1727096321.30346: variable 'ansible_shell_executable' from source: unknown 18911 1727096321.30399: variable 'ansible_connection' from source: unknown 18911 1727096321.30402: variable 'ansible_module_compression' from source: unknown 18911 1727096321.30407: variable 'ansible_shell_type' from source: unknown 18911 1727096321.30412: variable 'ansible_shell_executable' from source: unknown 18911 1727096321.30414: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096321.30417: variable 'ansible_pipelining' from source: unknown 18911 1727096321.30419: variable 'ansible_timeout' from source: unknown 18911 1727096321.30421: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096321.30594: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18911 1727096321.30626: variable 'omit' from source: magic vars 18911 1727096321.30727: starting attempt loop 18911 1727096321.30730: running the handler 18911 1727096321.30732: variable 'ansible_facts' from source: unknown 18911 1727096321.30734: _low_level_execute_command(): starting 18911 1727096321.30736: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18911 1727096321.31523: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096321.31541: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 
1727096321.31572: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096321.31614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration <<< 18911 1727096321.31675: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096321.31689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096321.31749: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096321.31772: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096321.31807: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096321.31945: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096321.33649: stdout chunk (state=3): >>>/root <<< 18911 1727096321.33804: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096321.33808: stdout chunk (state=3): >>><<< 18911 1727096321.33810: stderr chunk (state=3): >>><<< 18911 1727096321.33935: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096321.33938: _low_level_execute_command(): starting 18911 1727096321.33941: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096321.338367-20795-253909089197247 `" && echo ansible-tmp-1727096321.338367-20795-253909089197247="` echo /root/.ansible/tmp/ansible-tmp-1727096321.338367-20795-253909089197247 `" ) && sleep 0' 18911 1727096321.34497: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096321.34519: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096321.34536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096321.34556: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096321.34595: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 
originally 10.31.11.125 <<< 18911 1727096321.34627: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 18911 1727096321.34685: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096321.34739: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096321.34755: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096321.34787: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096321.34894: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096321.36877: stdout chunk (state=3): >>>ansible-tmp-1727096321.338367-20795-253909089197247=/root/.ansible/tmp/ansible-tmp-1727096321.338367-20795-253909089197247 <<< 18911 1727096321.37033: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096321.37055: stderr chunk (state=3): >>><<< 18911 1727096321.37071: stdout chunk (state=3): >>><<< 18911 1727096321.37274: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096321.338367-20795-253909089197247=/root/.ansible/tmp/ansible-tmp-1727096321.338367-20795-253909089197247 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096321.37278: variable 'ansible_module_compression' from source: unknown 18911 1727096321.37281: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18911d7od04qi/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 18911 1727096321.37283: variable 'ansible_facts' from source: unknown 18911 1727096321.37486: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096321.338367-20795-253909089197247/AnsiballZ_setup.py 18911 1727096321.37636: Sending initial data 18911 1727096321.37739: Sent initial data (153 bytes) 18911 1727096321.38296: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096321.38312: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096321.38328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096321.38348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096321.38371: stderr chunk (state=3): >>>debug2: 
checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 18911 1727096321.38427: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096321.38513: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096321.38544: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096321.38641: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096321.40285: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18911 1727096321.40357: stderr chunk (state=3): >>>debug2: 
Sending SSH2_FXP_REALPATH "." <<< 18911 1727096321.40446: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18911d7od04qi/tmplcxsysbd /root/.ansible/tmp/ansible-tmp-1727096321.338367-20795-253909089197247/AnsiballZ_setup.py <<< 18911 1727096321.40450: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096321.338367-20795-253909089197247/AnsiballZ_setup.py" <<< 18911 1727096321.40504: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18911d7od04qi/tmplcxsysbd" to remote "/root/.ansible/tmp/ansible-tmp-1727096321.338367-20795-253909089197247/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096321.338367-20795-253909089197247/AnsiballZ_setup.py" <<< 18911 1727096321.42083: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096321.42223: stderr chunk (state=3): >>><<< 18911 1727096321.42226: stdout chunk (state=3): >>><<< 18911 1727096321.42228: done transferring module to remote 18911 1727096321.42231: _low_level_execute_command(): starting 18911 1727096321.42233: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096321.338367-20795-253909089197247/ /root/.ansible/tmp/ansible-tmp-1727096321.338367-20795-253909089197247/AnsiballZ_setup.py && sleep 0' 18911 1727096321.42783: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096321.42800: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096321.42816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096321.42834: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096321.42852: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 
10.31.11.125 <<< 18911 1727096321.42924: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096321.42982: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096321.43001: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096321.43031: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096321.43137: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096321.45066: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096321.45104: stdout chunk (state=3): >>><<< 18911 1727096321.45108: stderr chunk (state=3): >>><<< 18911 1727096321.45212: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096321.45215: _low_level_execute_command(): starting 18911 1727096321.45219: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096321.338367-20795-253909089197247/AnsiballZ_setup.py && sleep 0' 18911 1727096321.45813: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096321.45884: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096321.45956: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 
18911 1727096321.45986: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096321.46002: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096321.46109: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096322.11600: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-125", "ansible_nodename": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2082d1236a0674562ec5e13633e7ec", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fibre_channel_wwn": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", 
"net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCSodd+GDkKMH+cjKateYxMnFqGIiE1cnsha6Cte8GTIWdeerR6yaWx/72lnuD/E1u9q+QiEGNjwKJ+EVlEbMpe1E41fS12cQMWtucXlJvMJH1Ykit9m7FF02YjLW9EkA9SSgc18cxhBxGHI+sOyXoiqbbp/q8CeDDI6WWQ+a3ZlRONcuaxwv9JBCJY6ZIqEDUsG55fAkwTft0LrM9VfN5D7iCcrlRTAVcjACJKgdvzHt52sVh5wuhCgzisIGrxldgqYAo0PnKgLwHzMWTYpwBJQIwMf6hC7O+PTnljDRXhsY8vTDdVI9W4Nr7mLWJ7W5JZ0lrKajT+FZ1b8usigp6firlnTtqzVmKJ4HJTlBOQ00HmCvVRSFb8Y15UB8Gz/xjJSXiQklJoKt+/wRRazuZALEowaIIdTzQUGv5Yx/BYhcAv8KVvhqbfkvyuwAgI+jzck3nhlUJ3B1WVYs5RdDY1WWEbqEtKyERi7P+cyWg6OhEMmJIpLs1QBW+CU/89w+s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE9Vvl7mU+dsvtGHlOyMt0Nx2BOUkSxor6Ul/pQgHJkP0t5ie5s9TW9uniTzz5NfayEXaMlnOkK/UBl0o7msG9k=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOsOWQBYI+BXY81/x/jv3URNQxi5p8vLDeQfEZVgnjS4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 40710 10.31.11.125 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", 
"SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 40710 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_is_chroot": false, "ansible_iscsi_iqn": "", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "58", "second": "41", "epoch": "1727096321", "epoch_int": "1727096321", "date": "2024-09-23", "time": "08:58:41", "iso8601_micro": "2024-09-23T12:58:41.741058Z", "iso8601": "2024-09-23T12:58:41Z", "iso8601_basic": "20240923T085841741058", "iso8601_basic_short": "20240923T085841", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_loadavg": {"1m": 0.66162109375, "5m": 0.40771484375, "15m": 0.19921875}, "ansible_fips": false, "ansible_pkg_mgr": "dnf", "ansible_local": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, 
"ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2930, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 601, "free": 2930}, "nocache": {"free": 3268, "used": 263}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_uuid": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 474, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": 
[{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261795487744, "block_size": 4096, "block_total": 65519099, "block_available": 63914914, "block_used": 1604185, "inode_total": 131070960, "inode_available": 131029100, "inode_used": 41860, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", 
"tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10ff:acff:fe3f:90f5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": 
"on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 
9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.125"], "ansible_all_ipv6_addresses": ["fe80::10ff:acff:fe3f:90f5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.125", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10ff:acff:fe3f:90f5"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 18911 1727096322.13426: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096322.13441: stderr chunk (state=3): >>>Shared connection to 10.31.11.125 closed. <<< 18911 1727096322.13505: stderr chunk (state=3): >>><<< 18911 1727096322.13517: stdout chunk (state=3): >>><<< 18911 1727096322.13679: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-125", "ansible_nodename": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2082d1236a0674562ec5e13633e7ec", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], 
"ansible_virtualization_tech_host": [], "ansible_fibre_channel_wwn": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCSodd+GDkKMH+cjKateYxMnFqGIiE1cnsha6Cte8GTIWdeerR6yaWx/72lnuD/E1u9q+QiEGNjwKJ+EVlEbMpe1E41fS12cQMWtucXlJvMJH1Ykit9m7FF02YjLW9EkA9SSgc18cxhBxGHI+sOyXoiqbbp/q8CeDDI6WWQ+a3ZlRONcuaxwv9JBCJY6ZIqEDUsG55fAkwTft0LrM9VfN5D7iCcrlRTAVcjACJKgdvzHt52sVh5wuhCgzisIGrxldgqYAo0PnKgLwHzMWTYpwBJQIwMf6hC7O+PTnljDRXhsY8vTDdVI9W4Nr7mLWJ7W5JZ0lrKajT+FZ1b8usigp6firlnTtqzVmKJ4HJTlBOQ00HmCvVRSFb8Y15UB8Gz/xjJSXiQklJoKt+/wRRazuZALEowaIIdTzQUGv5Yx/BYhcAv8KVvhqbfkvyuwAgI+jzck3nhlUJ3B1WVYs5RdDY1WWEbqEtKyERi7P+cyWg6OhEMmJIpLs1QBW+CU/89w+s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE9Vvl7mU+dsvtGHlOyMt0Nx2BOUkSxor6Ul/pQgHJkP0t5ie5s9TW9uniTzz5NfayEXaMlnOkK/UBl0o7msG9k=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOsOWQBYI+BXY81/x/jv3URNQxi5p8vLDeQfEZVgnjS4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, 
"ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 40710 10.31.11.125 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 40710 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_is_chroot": false, "ansible_iscsi_iqn": "", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "58", "second": "41", "epoch": "1727096321", "epoch_int": "1727096321", "date": "2024-09-23", "time": "08:58:41", "iso8601_micro": "2024-09-23T12:58:41.741058Z", "iso8601": "2024-09-23T12:58:41Z", "iso8601_basic": "20240923T085841741058", "iso8601_basic_short": 
"20240923T085841", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_loadavg": {"1m": 0.66162109375, "5m": 0.40771484375, "15m": 0.19921875}, "ansible_fips": false, "ansible_pkg_mgr": "dnf", "ansible_local": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2930, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 601, "free": 2930}, "nocache": {"free": 3268, "used": 263}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_uuid": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": 
"4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 474, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261795487744, "block_size": 4096, "block_total": 65519099, "block_available": 63914914, "block_used": 1604185, "inode_total": 131070960, "inode_available": 131029100, "inode_used": 41860, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off 
[fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": 
[{"address": "fe80::10ff:acff:fe3f:90f5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off 
[fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.125"], "ansible_all_ipv6_addresses": ["fe80::10ff:acff:fe3f:90f5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.125", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10ff:acff:fe3f:90f5"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 18911 1727096322.14438: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096321.338367-20795-253909089197247/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18911 1727096322.14515: _low_level_execute_command(): starting 18911 1727096322.14599: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096321.338367-20795-253909089197247/ > /dev/null 2>&1 && sleep 0' 18911 1727096322.15855: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096322.15871: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096322.15886: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096322.16170: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096322.18149: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096322.18153: stdout chunk (state=3): >>><<< 18911 1727096322.18158: stderr chunk (state=3): >>><<< 18911 1727096322.18178: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
18911 1727096322.18272: handler run complete
18911 1727096322.18311: variable 'ansible_facts' from source: unknown
18911 1727096322.18545: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18911 1727096322.19271: variable 'ansible_facts' from source: unknown
18911 1727096322.19340: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18911 1727096322.19674: attempt loop complete, returning result
18911 1727096322.19717: _execute() done
18911 1727096322.19720: dumping result to json
18911 1727096322.19752: done dumping result, returning
18911 1727096322.19756: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0afff68d-5257-09a7-aae1-0000000004e4]
18911 1727096322.19761: sending task result for task 0afff68d-5257-09a7-aae1-0000000004e4
ok: [managed_node1]
18911 1727096322.20855: no more pending results, returning what we have
18911 1727096322.20859: results queue empty
18911 1727096322.20860: checking for any_errors_fatal
18911 1727096322.20862: done checking for any_errors_fatal
18911 1727096322.20862: checking for max_fail_percentage
18911 1727096322.20869: done checking for max_fail_percentage
18911 1727096322.20870: checking to see if all hosts have failed and the running result is not ok
18911 1727096322.20871: done checking to see if all hosts have failed
18911 1727096322.20872: getting the remaining hosts for this loop
18911 1727096322.20873: done getting the remaining hosts for this loop
18911 1727096322.20876: getting the next task for host managed_node1
18911 1727096322.20881: done getting next task for host managed_node1
18911 1727096322.20883: ^ task is: TASK: meta (flush_handlers)
18911 1727096322.20885: ^ state is:
HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18911 1727096322.20889: getting variables
18911 1727096322.20890: in VariableManager get_vars()
18911 1727096322.20971: Calling all_inventory to load vars for managed_node1
18911 1727096322.20974: Calling groups_inventory to load vars for managed_node1
18911 1727096322.20978: Calling all_plugins_inventory to load vars for managed_node1
18911 1727096322.20988: Calling all_plugins_play to load vars for managed_node1
18911 1727096322.20991: Calling groups_plugins_inventory to load vars for managed_node1
18911 1727096322.20994: Calling groups_plugins_play to load vars for managed_node1
18911 1727096322.21545: done sending task result for task 0afff68d-5257-09a7-aae1-0000000004e4
18911 1727096322.21556: WORKER PROCESS EXITING
18911 1727096322.23092: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18911 1727096322.25511: done with get_vars()
18911 1727096322.25533: done getting variables
18911 1727096322.25606: in VariableManager get_vars()
18911 1727096322.25616: Calling all_inventory to load vars for managed_node1
18911 1727096322.25618: Calling groups_inventory to load vars for managed_node1
18911 1727096322.25620: Calling all_plugins_inventory to load vars for managed_node1
18911 1727096322.25625: Calling all_plugins_play to load vars for managed_node1
18911 1727096322.25627: Calling groups_plugins_inventory to load vars for managed_node1
18911 1727096322.25630: Calling groups_plugins_play to load vars for managed_node1
18911 1727096322.26766: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18911 1727096322.29888: done with get_vars()
18911 1727096322.29927: done
queuing things up, now waiting for results queue to drain
18911 1727096322.29929: results queue empty
18911 1727096322.29930: checking for any_errors_fatal
18911 1727096322.29935: done checking for any_errors_fatal
18911 1727096322.29936: checking for max_fail_percentage
18911 1727096322.29937: done checking for max_fail_percentage
18911 1727096322.29938: checking to see if all hosts have failed and the running result is not ok
18911 1727096322.29938: done checking to see if all hosts have failed
18911 1727096322.29944: getting the remaining hosts for this loop
18911 1727096322.29945: done getting the remaining hosts for this loop
18911 1727096322.29948: getting the next task for host managed_node1
18911 1727096322.29952: done getting next task for host managed_node1
18911 1727096322.29955: ^ task is: TASK: Include the task 'assert_profile_absent.yml'
18911 1727096322.29957: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False
18911 1727096322.29959: getting variables
18911 1727096322.29960: in VariableManager get_vars()
18911 1727096322.29975: Calling all_inventory to load vars for managed_node1
18911 1727096322.29977: Calling groups_inventory to load vars for managed_node1
18911 1727096322.29979: Calling all_plugins_inventory to load vars for managed_node1
18911 1727096322.29984: Calling all_plugins_play to load vars for managed_node1
18911 1727096322.29986: Calling groups_plugins_inventory to load vars for managed_node1
18911 1727096322.29989: Calling groups_plugins_play to load vars for managed_node1
18911 1727096322.32657: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18911 1727096322.35976: done with get_vars()
18911 1727096322.36006: done getting variables

TASK [Include the task 'assert_profile_absent.yml'] ****************************
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:71
Monday 23 September 2024 08:58:42 -0400 (0:00:01.076) 0:00:41.475 ******
18911 1727096322.36089: entering _queue_task() for managed_node1/include_tasks
18911 1727096322.36854: worker is 1 (out of 1 available)
18911 1727096322.36871: exiting _queue_task() for managed_node1/include_tasks
18911 1727096322.36883: done queuing things up, now waiting for results queue to drain
18911 1727096322.36884: waiting for pending results...
18911 1727096322.37533: running TaskExecutor() for managed_node1/TASK: Include the task 'assert_profile_absent.yml'
18911 1727096322.37577: in run() - task 0afff68d-5257-09a7-aae1-000000000074
18911 1727096322.37601: variable 'ansible_search_path' from source: unknown
18911 1727096322.37665: calling self._execute()
18911 1727096322.37956: variable 'ansible_host' from source: host vars for 'managed_node1'
18911 1727096322.37972: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18911 1727096322.37988: variable 'omit' from source: magic vars
18911 1727096322.38807: variable 'ansible_distribution_major_version' from source: facts
18911 1727096322.38831: Evaluated conditional (ansible_distribution_major_version != '6'): True
18911 1727096322.38842: _execute() done
18911 1727096322.38899: dumping result to json
18911 1727096322.38907: done dumping result, returning
18911 1727096322.38920: done running TaskExecutor() for managed_node1/TASK: Include the task 'assert_profile_absent.yml' [0afff68d-5257-09a7-aae1-000000000074]
18911 1727096322.38935: sending task result for task 0afff68d-5257-09a7-aae1-000000000074
18911 1727096322.39162: done sending task result for task 0afff68d-5257-09a7-aae1-000000000074
18911 1727096322.39166: WORKER PROCESS EXITING
18911 1727096322.39195: no more pending results, returning what we have
18911 1727096322.39201: in VariableManager get_vars()
18911 1727096322.39237: Calling all_inventory to load vars for managed_node1
18911 1727096322.39241: Calling groups_inventory to load vars for managed_node1
18911 1727096322.39244: Calling all_plugins_inventory to load vars for managed_node1
18911 1727096322.39260: Calling all_plugins_play to load vars for managed_node1
18911 1727096322.39266: Calling groups_plugins_inventory to load vars for managed_node1
18911 1727096322.39271: Calling groups_plugins_play to load vars for managed_node1
18911 1727096322.42286:
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18911 1727096322.44005: done with get_vars()
18911 1727096322.44033: variable 'ansible_search_path' from source: unknown
18911 1727096322.44050: we have included files to process
18911 1727096322.44052: generating all_blocks data
18911 1727096322.44053: done generating all_blocks data
18911 1727096322.44054: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml
18911 1727096322.44055: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml
18911 1727096322.44057: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml
18911 1727096322.44235: in VariableManager get_vars()
18911 1727096322.44262: done with get_vars()
18911 1727096322.44404: done processing included file
18911 1727096322.44406: iterating over new_blocks loaded from include file
18911 1727096322.44408: in VariableManager get_vars()
18911 1727096322.44419: done with get_vars()
18911 1727096322.44421: filtering new block on tags
18911 1727096322.44438: done filtering new block on tags
18911 1727096322.44440: done iterating over new_blocks loaded from include file
included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed_node1
18911 1727096322.44445: extending task lists for all hosts with included blocks
18911 1727096322.44514: done extending task lists
18911 1727096322.44515: done processing included files
18911 1727096322.44516: results queue empty
18911 1727096322.44517: checking for any_errors_fatal
18911 1727096322.44518: done checking for any_errors_fatal
18911 1727096322.44519: checking for max_fail_percentage
18911 1727096322.44520: done
checking for max_fail_percentage
18911 1727096322.44520: checking to see if all hosts have failed and the running result is not ok
18911 1727096322.44521: done checking to see if all hosts have failed
18911 1727096322.44522: getting the remaining hosts for this loop
18911 1727096322.44523: done getting the remaining hosts for this loop
18911 1727096322.44525: getting the next task for host managed_node1
18911 1727096322.44529: done getting next task for host managed_node1
18911 1727096322.44531: ^ task is: TASK: Include the task 'get_profile_stat.yml'
18911 1727096322.44533: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False
18911 1727096322.44535: getting variables
18911 1727096322.44536: in VariableManager get_vars()
18911 1727096322.44553: Calling all_inventory to load vars for managed_node1
18911 1727096322.44555: Calling groups_inventory to load vars for managed_node1
18911 1727096322.44557: Calling all_plugins_inventory to load vars for managed_node1
18911 1727096322.44566: Calling all_plugins_play to load vars for managed_node1
18911 1727096322.44570: Calling groups_plugins_inventory to load vars for managed_node1
18911 1727096322.44573: Calling groups_plugins_play to load vars for managed_node1
18911 1727096322.46599: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18911 1727096322.48276: done with get_vars()
18911 1727096322.48340: done getting variables

TASK [Include the task 'get_profile_stat.yml'] *********************************
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3
Monday 23 September 2024 08:58:42 -0400 (0:00:00.124) 0:00:41.599 ******
18911 1727096322.48530: entering _queue_task() for managed_node1/include_tasks
18911 1727096322.49074: worker is 1 (out of 1 available)
18911 1727096322.49089: exiting _queue_task() for managed_node1/include_tasks
18911 1727096322.49102: done queuing things up, now waiting for results queue to drain
18911 1727096322.49104: waiting for pending results...
18911 1727096322.49685: running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml'
18911 1727096322.49757: in run() - task 0afff68d-5257-09a7-aae1-0000000004f5
18911 1727096322.49788: variable 'ansible_search_path' from source: unknown
18911 1727096322.49798: variable 'ansible_search_path' from source: unknown
18911 1727096322.49874: calling self._execute()
18911 1727096322.49996: variable 'ansible_host' from source: host vars for 'managed_node1'
18911 1727096322.50011: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18911 1727096322.50070: variable 'omit' from source: magic vars
18911 1727096322.50396: variable 'ansible_distribution_major_version' from source: facts
18911 1727096322.50406: Evaluated conditional (ansible_distribution_major_version != '6'): True
18911 1727096322.50409: _execute() done
18911 1727096322.50414: dumping result to json
18911 1727096322.50417: done dumping result, returning
18911 1727096322.50423: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' [0afff68d-5257-09a7-aae1-0000000004f5]
18911 1727096322.50428: sending task result for task 0afff68d-5257-09a7-aae1-0000000004f5
18911 1727096322.50514: done sending task result for task 0afff68d-5257-09a7-aae1-0000000004f5
18911 1727096322.50517: WORKER PROCESS EXITING
18911 1727096322.50542: no more pending results, returning what we have
18911 1727096322.50547: in VariableManager get_vars()
18911 1727096322.50592: Calling all_inventory to load vars for managed_node1
18911 1727096322.50596: Calling groups_inventory to load vars for managed_node1
18911 1727096322.50599: Calling all_plugins_inventory to load vars for managed_node1
18911 1727096322.50613: Calling all_plugins_play to load vars for managed_node1
18911 1727096322.50616: Calling groups_plugins_inventory to load vars for managed_node1
18911 1727096322.50619: Calling groups_plugins_play to load vars for managed_node1
18911
1727096322.51644: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18911 1727096322.53410: done with get_vars()
18911 1727096322.53435: variable 'ansible_search_path' from source: unknown
18911 1727096322.53437: variable 'ansible_search_path' from source: unknown
18911 1727096322.53529: we have included files to process
18911 1727096322.53530: generating all_blocks data
18911 1727096322.53532: done generating all_blocks data
18911 1727096322.53533: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml
18911 1727096322.53534: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml
18911 1727096322.53536: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml
18911 1727096322.54825: done processing included file
18911 1727096322.54827: iterating over new_blocks loaded from include file
18911 1727096322.54829: in VariableManager get_vars()
18911 1727096322.54844: done with get_vars()
18911 1727096322.54846: filtering new block on tags
18911 1727096322.54881: done filtering new block on tags
18911 1727096322.54885: in VariableManager get_vars()
18911 1727096322.54902: done with get_vars()
18911 1727096322.54907: filtering new block on tags
18911 1727096322.54936: done filtering new block on tags
18911 1727096322.54939: done iterating over new_blocks loaded from include file
included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node1
18911 1727096322.54943: extending task lists for all hosts with included blocks
18911 1727096322.55013: done extending task lists
18911 1727096322.55014: done processing included files
18911 1727096322.55015: results queue empty
18911
1727096322.55015: checking for any_errors_fatal
18911 1727096322.55018: done checking for any_errors_fatal
18911 1727096322.55018: checking for max_fail_percentage
18911 1727096322.55019: done checking for max_fail_percentage
18911 1727096322.55019: checking to see if all hosts have failed and the running result is not ok
18911 1727096322.55020: done checking to see if all hosts have failed
18911 1727096322.55020: getting the remaining hosts for this loop
18911 1727096322.55021: done getting the remaining hosts for this loop
18911 1727096322.55023: getting the next task for host managed_node1
18911 1727096322.55025: done getting next task for host managed_node1
18911 1727096322.55027: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag
18911 1727096322.55031: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False
18911 1727096322.55033: getting variables
18911 1727096322.55033: in VariableManager get_vars()
18911 1727096322.55178: Calling all_inventory to load vars for managed_node1
18911 1727096322.55180: Calling groups_inventory to load vars for managed_node1
18911 1727096322.55182: Calling all_plugins_inventory to load vars for managed_node1
18911 1727096322.55187: Calling all_plugins_play to load vars for managed_node1
18911 1727096322.55188: Calling groups_plugins_inventory to load vars for managed_node1
18911 1727096322.55190: Calling groups_plugins_play to load vars for managed_node1
18911 1727096322.55875: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18911 1727096322.57539: done with get_vars()
18911 1727096322.57569: done getting variables
18911 1727096322.57616: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Initialize NM profile exist and ansible_managed comment flag] ************
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3
Monday 23 September 2024 08:58:42 -0400 (0:00:00.091) 0:00:41.690 ******
18911 1727096322.57648: entering _queue_task() for managed_node1/set_fact
18911 1727096322.57943: worker is 1 (out of 1 available)
18911 1727096322.57959: exiting _queue_task() for managed_node1/set_fact
18911 1727096322.57976: done queuing things up, now waiting for results queue to drain
18911 1727096322.57979: waiting for pending results...
18911 1727096322.58217: running TaskExecutor() for managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag
18911 1727096322.58233: in run() - task 0afff68d-5257-09a7-aae1-000000000502
18911 1727096322.58255: variable 'ansible_search_path' from source: unknown
18911 1727096322.58259: variable 'ansible_search_path' from source: unknown
18911 1727096322.58283: calling self._execute()
18911 1727096322.58352: variable 'ansible_host' from source: host vars for 'managed_node1'
18911 1727096322.58362: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18911 1727096322.58370: variable 'omit' from source: magic vars
18911 1727096322.58651: variable 'ansible_distribution_major_version' from source: facts
18911 1727096322.58661: Evaluated conditional (ansible_distribution_major_version != '6'): True
18911 1727096322.58669: variable 'omit' from source: magic vars
18911 1727096322.58710: variable 'omit' from source: magic vars
18911 1727096322.58735: variable 'omit' from source: magic vars
18911 1727096322.58771: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
18911 1727096322.58803: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
18911 1727096322.58816: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
18911 1727096322.58829: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
18911 1727096322.58838: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
18911 1727096322.58862: variable 'inventory_hostname' from source: host vars for 'managed_node1'
18911 1727096322.58870: variable 'ansible_host' from source: host vars for 'managed_node1'
18911 1727096322.58873: variable 'ansible_ssh_extra_args' from source: host vars for
'managed_node1'
18911 1727096322.58940: Set connection var ansible_shell_executable to /bin/sh
18911 1727096322.58944: Set connection var ansible_timeout to 10
18911 1727096322.58946: Set connection var ansible_shell_type to sh
18911 1727096322.58953: Set connection var ansible_module_compression to ZIP_DEFLATED
18911 1727096322.58960: Set connection var ansible_pipelining to False
18911 1727096322.58962: Set connection var ansible_connection to ssh
18911 1727096322.58984: variable 'ansible_shell_executable' from source: unknown
18911 1727096322.58987: variable 'ansible_connection' from source: unknown
18911 1727096322.58990: variable 'ansible_module_compression' from source: unknown
18911 1727096322.58992: variable 'ansible_shell_type' from source: unknown
18911 1727096322.58994: variable 'ansible_shell_executable' from source: unknown
18911 1727096322.58996: variable 'ansible_host' from source: host vars for 'managed_node1'
18911 1727096322.58998: variable 'ansible_pipelining' from source: unknown
18911 1727096322.59003: variable 'ansible_timeout' from source: unknown
18911 1727096322.59005: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18911 1727096322.59112: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
18911 1727096322.59122: variable 'omit' from source: magic vars
18911 1727096322.59131: starting attempt loop
18911 1727096322.59134: running the handler
18911 1727096322.59141: handler run complete
18911 1727096322.59149: attempt loop complete, returning result
18911 1727096322.59151: _execute() done
18911 1727096322.59154: dumping result to json
18911 1727096322.59158: done dumping result, returning
18911 1727096322.59167: done running TaskExecutor() for
managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag [0afff68d-5257-09a7-aae1-000000000502]
18911 1727096322.59171: sending task result for task 0afff68d-5257-09a7-aae1-000000000502
18911 1727096322.59247: done sending task result for task 0afff68d-5257-09a7-aae1-000000000502
18911 1727096322.59250: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "ansible_facts": {
        "lsr_net_profile_ansible_managed": false,
        "lsr_net_profile_exists": false,
        "lsr_net_profile_fingerprint": false
    },
    "changed": false
}
18911 1727096322.59342: no more pending results, returning what we have
18911 1727096322.59346: results queue empty
18911 1727096322.59347: checking for any_errors_fatal
18911 1727096322.59348: done checking for any_errors_fatal
18911 1727096322.59349: checking for max_fail_percentage
18911 1727096322.59350: done checking for max_fail_percentage
18911 1727096322.59351: checking to see if all hosts have failed and the running result is not ok
18911 1727096322.59352: done checking to see if all hosts have failed
18911 1727096322.59353: getting the remaining hosts for this loop
18911 1727096322.59354: done getting the remaining hosts for this loop
18911 1727096322.59357: getting the next task for host managed_node1
18911 1727096322.59369: done getting next task for host managed_node1
18911 1727096322.59372: ^ task is: TASK: Stat profile file
18911 1727096322.59375: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state?
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18911 1727096322.59378: getting variables
18911 1727096322.59380: in VariableManager get_vars()
18911 1727096322.59405: Calling all_inventory to load vars for managed_node1
18911 1727096322.59408: Calling groups_inventory to load vars for managed_node1
18911 1727096322.59410: Calling all_plugins_inventory to load vars for managed_node1
18911 1727096322.59420: Calling all_plugins_play to load vars for managed_node1
18911 1727096322.59422: Calling groups_plugins_inventory to load vars for managed_node1
18911 1727096322.59424: Calling groups_plugins_play to load vars for managed_node1
18911 1727096322.60323: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18911 1727096322.61212: done with get_vars()
18911 1727096322.61231: done getting variables

TASK [Stat profile file] *******************************************************
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9
Monday 23 September 2024 08:58:42 -0400 (0:00:00.036) 0:00:41.727 ******
18911 1727096322.61303: entering _queue_task() for managed_node1/stat
18911 1727096322.61562: worker is 1 (out of 1 available)
18911 1727096322.61581: exiting _queue_task() for managed_node1/stat
18911 1727096322.61592: done queuing things up, now waiting for results queue to drain
18911 1727096322.61594: waiting for pending results...
18911 1727096322.61766: running TaskExecutor() for managed_node1/TASK: Stat profile file
18911 1727096322.61836: in run() - task 0afff68d-5257-09a7-aae1-000000000503
18911 1727096322.61848: variable 'ansible_search_path' from source: unknown
18911 1727096322.61852: variable 'ansible_search_path' from source: unknown
18911 1727096322.61883: calling self._execute()
18911 1727096322.61952: variable 'ansible_host' from source: host vars for 'managed_node1'
18911 1727096322.61955: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18911 1727096322.61975: variable 'omit' from source: magic vars
18911 1727096322.62255: variable 'ansible_distribution_major_version' from source: facts
18911 1727096322.62279: Evaluated conditional (ansible_distribution_major_version != '6'): True
18911 1727096322.62284: variable 'omit' from source: magic vars
18911 1727096322.62322: variable 'omit' from source: magic vars
18911 1727096322.62400: variable 'profile' from source: include params
18911 1727096322.62406: variable 'interface' from source: set_fact
18911 1727096322.62456: variable 'interface' from source: set_fact
18911 1727096322.62477: variable 'omit' from source: magic vars
18911 1727096322.62511: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
18911 1727096322.62538: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
18911 1727096322.62556: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
18911 1727096322.62573: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
18911 1727096322.62585: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
18911 1727096322.62608: variable 'inventory_hostname' from source: host vars for 'managed_node1'
18911
1727096322.62612: variable 'ansible_host' from source: host vars for 'managed_node1'
18911 1727096322.62616: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18911 1727096322.62689: Set connection var ansible_shell_executable to /bin/sh
18911 1727096322.62692: Set connection var ansible_timeout to 10
18911 1727096322.62695: Set connection var ansible_shell_type to sh
18911 1727096322.62706: Set connection var ansible_module_compression to ZIP_DEFLATED
18911 1727096322.62709: Set connection var ansible_pipelining to False
18911 1727096322.62713: Set connection var ansible_connection to ssh
18911 1727096322.62730: variable 'ansible_shell_executable' from source: unknown
18911 1727096322.62733: variable 'ansible_connection' from source: unknown
18911 1727096322.62735: variable 'ansible_module_compression' from source: unknown
18911 1727096322.62737: variable 'ansible_shell_type' from source: unknown
18911 1727096322.62740: variable 'ansible_shell_executable' from source: unknown
18911 1727096322.62743: variable 'ansible_host' from source: host vars for 'managed_node1'
18911 1727096322.62748: variable 'ansible_pipelining' from source: unknown
18911 1727096322.62751: variable 'ansible_timeout' from source: unknown
18911 1727096322.62753: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18911 1727096322.62908: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__)
18911 1727096322.62921: variable 'omit' from source: magic vars
18911 1727096322.62931: starting attempt loop
18911 1727096322.62934: running the handler
18911 1727096322.62946: _low_level_execute_command(): starting
18911 1727096322.62953: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
18911 1727096322.63476: stderr chunk
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096322.63480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 18911 1727096322.63484: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096322.63541: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096322.63545: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096322.63547: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096322.63626: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096322.65353: stdout chunk (state=3): >>>/root <<< 18911 1727096322.65450: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096322.65483: stderr chunk (state=3): >>><<< 18911 1727096322.65487: stdout chunk (state=3): >>><<< 18911 1727096322.65507: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096322.65519: _low_level_execute_command(): starting 18911 1727096322.65527: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096322.6550775-20845-107410554012108 `" && echo ansible-tmp-1727096322.6550775-20845-107410554012108="` echo /root/.ansible/tmp/ansible-tmp-1727096322.6550775-20845-107410554012108 `" ) && sleep 0' 18911 1727096322.65987: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096322.65990: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096322.66002: stderr 
chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096322.66005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096322.66051: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096322.66055: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096322.66060: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096322.66128: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096322.68056: stdout chunk (state=3): >>>ansible-tmp-1727096322.6550775-20845-107410554012108=/root/.ansible/tmp/ansible-tmp-1727096322.6550775-20845-107410554012108 <<< 18911 1727096322.68154: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096322.68191: stderr chunk (state=3): >>><<< 18911 1727096322.68194: stdout chunk (state=3): >>><<< 18911 1727096322.68211: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096322.6550775-20845-107410554012108=/root/.ansible/tmp/ansible-tmp-1727096322.6550775-20845-107410554012108 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096322.68251: variable 'ansible_module_compression' from source: unknown 18911 1727096322.68302: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18911d7od04qi/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 18911 1727096322.68331: variable 'ansible_facts' from source: unknown 18911 1727096322.68395: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096322.6550775-20845-107410554012108/AnsiballZ_stat.py 18911 1727096322.68499: Sending initial data 18911 1727096322.68502: Sent initial data (153 bytes) 18911 1727096322.68941: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096322.68945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096322.68966: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096322.69017: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096322.69020: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096322.69027: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096322.69092: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096322.70687: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18911 1727096322.70747: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18911 1727096322.70812: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18911d7od04qi/tmppk7_yzc4 /root/.ansible/tmp/ansible-tmp-1727096322.6550775-20845-107410554012108/AnsiballZ_stat.py <<< 18911 1727096322.70818: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096322.6550775-20845-107410554012108/AnsiballZ_stat.py" <<< 18911 1727096322.70882: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18911d7od04qi/tmppk7_yzc4" to remote "/root/.ansible/tmp/ansible-tmp-1727096322.6550775-20845-107410554012108/AnsiballZ_stat.py" <<< 18911 1727096322.70884: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096322.6550775-20845-107410554012108/AnsiballZ_stat.py" <<< 18911 1727096322.71503: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096322.71548: stderr chunk (state=3): >>><<< 18911 1727096322.71551: stdout chunk (state=3): >>><<< 18911 1727096322.71601: done transferring module to remote 18911 1727096322.71609: _low_level_execute_command(): starting 18911 1727096322.71614: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096322.6550775-20845-107410554012108/ /root/.ansible/tmp/ansible-tmp-1727096322.6550775-20845-107410554012108/AnsiballZ_stat.py && sleep 0' 18911 1727096322.72071: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096322.72075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 18911 1727096322.72077: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 18911 1727096322.72079: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096322.72085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096322.72137: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096322.72140: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096322.72146: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096322.72210: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096322.74064: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096322.74072: stderr chunk (state=3): >>><<< 18911 1727096322.74075: stdout chunk (state=3): >>><<< 18911 1727096322.74091: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096322.74094: _low_level_execute_command(): starting 18911 1727096322.74099: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096322.6550775-20845-107410554012108/AnsiballZ_stat.py && sleep 0' 18911 1727096322.74652: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096322.74656: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096322.74659: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096322.74717: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master 
at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096322.74737: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096322.74799: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096322.74887: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096322.90550: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-lsr27", "follow": false, "checksum_algorithm": "sha1"}}} <<< 18911 1727096322.91786: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. <<< 18911 1727096322.91796: stdout chunk (state=3): >>><<< 18911 1727096322.91809: stderr chunk (state=3): >>><<< 18911 1727096322.91829: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-lsr27", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 18911 1727096322.91863: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-lsr27', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096322.6550775-20845-107410554012108/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18911 1727096322.91885: _low_level_execute_command(): starting 18911 1727096322.91959: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096322.6550775-20845-107410554012108/ > /dev/null 2>&1 && sleep 0' 18911 1727096322.92513: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096322.92526: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096322.92541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096322.92559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 
1727096322.92588: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 18911 1727096322.92600: stderr chunk (state=3): >>>debug2: match not found <<< 18911 1727096322.92613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096322.92716: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096322.92738: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096322.92840: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096322.94814: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096322.94818: stdout chunk (state=3): >>><<< 18911 1727096322.94820: stderr chunk (state=3): >>><<< 18911 1727096322.94836: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096322.94851: handler run complete 18911 1727096322.95075: attempt loop complete, returning result 18911 1727096322.95078: _execute() done 18911 1727096322.95080: dumping result to json 18911 1727096322.95082: done dumping result, returning 18911 1727096322.95084: done running TaskExecutor() for managed_node1/TASK: Stat profile file [0afff68d-5257-09a7-aae1-000000000503] 18911 1727096322.95086: sending task result for task 0afff68d-5257-09a7-aae1-000000000503 18911 1727096322.95159: done sending task result for task 0afff68d-5257-09a7-aae1-000000000503 18911 1727096322.95162: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 18911 1727096322.95236: no more pending results, returning what we have 18911 1727096322.95240: results queue empty 18911 1727096322.95241: checking for any_errors_fatal 18911 1727096322.95249: done checking for any_errors_fatal 18911 1727096322.95250: checking for max_fail_percentage 18911 1727096322.95252: done checking for max_fail_percentage 18911 1727096322.95253: checking to see if all hosts have failed and the running result is not ok 18911 1727096322.95254: done checking to see if all hosts have failed 18911 1727096322.95254: getting the remaining hosts for this loop 18911 
1727096322.95256: done getting the remaining hosts for this loop 18911 1727096322.95259: getting the next task for host managed_node1 18911 1727096322.95272: done getting next task for host managed_node1 18911 1727096322.95275: ^ task is: TASK: Set NM profile exist flag based on the profile files 18911 1727096322.95279: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096322.95283: getting variables 18911 1727096322.95285: in VariableManager get_vars() 18911 1727096322.95316: Calling all_inventory to load vars for managed_node1 18911 1727096322.95319: Calling groups_inventory to load vars for managed_node1 18911 1727096322.95322: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096322.95334: Calling all_plugins_play to load vars for managed_node1 18911 1727096322.95337: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096322.95340: Calling groups_plugins_play to load vars for managed_node1 18911 1727096322.98291: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096323.01636: done with get_vars() 18911 1727096323.01874: done getting variables 18911 1727096323.01940: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Monday 23 September 2024 08:58:43 -0400 (0:00:00.406) 0:00:42.134 ****** 18911 1727096323.01980: entering _queue_task() for managed_node1/set_fact 18911 1727096323.02745: worker is 1 (out of 1 available) 18911 1727096323.02757: exiting _queue_task() for managed_node1/set_fact 18911 1727096323.02775: done queuing things up, now waiting for results queue to drain 18911 1727096323.02777: waiting for pending results... 
18911 1727096323.03888: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files 18911 1727096323.03896: in run() - task 0afff68d-5257-09a7-aae1-000000000504 18911 1727096323.03913: variable 'ansible_search_path' from source: unknown 18911 1727096323.03916: variable 'ansible_search_path' from source: unknown 18911 1727096323.03950: calling self._execute() 18911 1727096323.04031: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096323.04374: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096323.04378: variable 'omit' from source: magic vars 18911 1727096323.04892: variable 'ansible_distribution_major_version' from source: facts 18911 1727096323.05173: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096323.05215: variable 'profile_stat' from source: set_fact 18911 1727096323.05673: Evaluated conditional (profile_stat.stat.exists): False 18911 1727096323.05676: when evaluation is False, skipping this task 18911 1727096323.05679: _execute() done 18911 1727096323.05682: dumping result to json 18911 1727096323.05684: done dumping result, returning 18911 1727096323.05687: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files [0afff68d-5257-09a7-aae1-000000000504] 18911 1727096323.05689: sending task result for task 0afff68d-5257-09a7-aae1-000000000504 18911 1727096323.05758: done sending task result for task 0afff68d-5257-09a7-aae1-000000000504 18911 1727096323.05763: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 18911 1727096323.05811: no more pending results, returning what we have 18911 1727096323.05815: results queue empty 18911 1727096323.05816: checking for any_errors_fatal 18911 1727096323.05824: done checking for any_errors_fatal 18911 1727096323.05825: 
checking for max_fail_percentage 18911 1727096323.05826: done checking for max_fail_percentage 18911 1727096323.05827: checking to see if all hosts have failed and the running result is not ok 18911 1727096323.05828: done checking to see if all hosts have failed 18911 1727096323.05829: getting the remaining hosts for this loop 18911 1727096323.05830: done getting the remaining hosts for this loop 18911 1727096323.05833: getting the next task for host managed_node1 18911 1727096323.05840: done getting next task for host managed_node1 18911 1727096323.05843: ^ task is: TASK: Get NM profile info 18911 1727096323.05846: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096323.05849: getting variables 18911 1727096323.05851: in VariableManager get_vars() 18911 1727096323.06094: Calling all_inventory to load vars for managed_node1 18911 1727096323.06097: Calling groups_inventory to load vars for managed_node1 18911 1727096323.06100: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096323.06110: Calling all_plugins_play to load vars for managed_node1 18911 1727096323.06113: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096323.06115: Calling groups_plugins_play to load vars for managed_node1 18911 1727096323.16076: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096323.19470: done with get_vars() 18911 1727096323.19513: done getting variables 18911 1727096323.19682: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Monday 23 September 2024 08:58:43 -0400 (0:00:00.177) 0:00:42.311 ****** 18911 1727096323.19709: entering _queue_task() for managed_node1/shell 18911 1727096323.19711: Creating lock for shell 18911 1727096323.20746: worker is 1 (out of 1 available) 18911 1727096323.20759: exiting _queue_task() for managed_node1/shell 18911 1727096323.20777: done queuing things up, now waiting for results queue to drain 18911 1727096323.20779: waiting for pending results... 
18911 1727096323.21653: running TaskExecutor() for managed_node1/TASK: Get NM profile info 18911 1727096323.22375: in run() - task 0afff68d-5257-09a7-aae1-000000000505 18911 1727096323.22380: variable 'ansible_search_path' from source: unknown 18911 1727096323.22383: variable 'ansible_search_path' from source: unknown 18911 1727096323.22386: calling self._execute() 18911 1727096323.22388: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096323.22391: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096323.22393: variable 'omit' from source: magic vars 18911 1727096323.23285: variable 'ansible_distribution_major_version' from source: facts 18911 1727096323.23307: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096323.23319: variable 'omit' from source: magic vars 18911 1727096323.23573: variable 'omit' from source: magic vars 18911 1727096323.23604: variable 'profile' from source: include params 18911 1727096323.23707: variable 'interface' from source: set_fact 18911 1727096323.23783: variable 'interface' from source: set_fact 18911 1727096323.23829: variable 'omit' from source: magic vars 18911 1727096323.23958: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18911 1727096323.24001: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18911 1727096323.24046: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18911 1727096323.24152: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096323.24171: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096323.24206: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18911 
1727096323.24214: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096323.24249: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096323.24472: Set connection var ansible_shell_executable to /bin/sh 18911 1727096323.24484: Set connection var ansible_timeout to 10 18911 1727096323.24491: Set connection var ansible_shell_type to sh 18911 1727096323.24674: Set connection var ansible_module_compression to ZIP_DEFLATED 18911 1727096323.24678: Set connection var ansible_pipelining to False 18911 1727096323.24680: Set connection var ansible_connection to ssh 18911 1727096323.24682: variable 'ansible_shell_executable' from source: unknown 18911 1727096323.24684: variable 'ansible_connection' from source: unknown 18911 1727096323.24686: variable 'ansible_module_compression' from source: unknown 18911 1727096323.24688: variable 'ansible_shell_type' from source: unknown 18911 1727096323.24690: variable 'ansible_shell_executable' from source: unknown 18911 1727096323.24692: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096323.24693: variable 'ansible_pipelining' from source: unknown 18911 1727096323.24695: variable 'ansible_timeout' from source: unknown 18911 1727096323.24697: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096323.25109: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18911 1727096323.25112: variable 'omit' from source: magic vars 18911 1727096323.25114: starting attempt loop 18911 1727096323.25116: running the handler 18911 1727096323.25118: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18911 1727096323.25120: _low_level_execute_command(): starting 18911 1727096323.25121: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18911 1727096323.26716: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096323.26816: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096323.26931: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096323.28664: stdout chunk (state=3): >>>/root <<< 18911 1727096323.28844: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096323.28859: stdout chunk (state=3): >>><<< 18911 1727096323.28882: stderr chunk (state=3): >>><<< 18911 1727096323.29100: _low_level_execute_command() done: rc=0, stdout=/root , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096323.29105: _low_level_execute_command(): starting 18911 1727096323.29108: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096323.290004-20872-169716208226403 `" && echo ansible-tmp-1727096323.290004-20872-169716208226403="` echo /root/.ansible/tmp/ansible-tmp-1727096323.290004-20872-169716208226403 `" ) && sleep 0' 18911 1727096323.30288: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096323.30375: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096323.30434: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096323.30459: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096323.30566: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096323.32538: stdout chunk (state=3): >>>ansible-tmp-1727096323.290004-20872-169716208226403=/root/.ansible/tmp/ansible-tmp-1727096323.290004-20872-169716208226403 <<< 18911 1727096323.32691: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096323.32977: stderr chunk (state=3): >>><<< 18911 1727096323.32982: stdout chunk (state=3): >>><<< 18911 1727096323.32985: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096323.290004-20872-169716208226403=/root/.ansible/tmp/ansible-tmp-1727096323.290004-20872-169716208226403 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096323.32988: variable 'ansible_module_compression' from source: unknown 18911 1727096323.32991: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18911d7od04qi/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 18911 1727096323.32994: variable 'ansible_facts' from source: unknown 18911 1727096323.33253: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096323.290004-20872-169716208226403/AnsiballZ_command.py 18911 1727096323.33580: Sending initial data 18911 1727096323.33682: Sent initial data (155 bytes) 18911 1727096323.35026: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096323.35089: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096323.35109: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096323.35126: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096323.35307: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096323.36992: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18911 1727096323.37023: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18911 1727096323.37123: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18911d7od04qi/tmp_qn20gt2 /root/.ansible/tmp/ansible-tmp-1727096323.290004-20872-169716208226403/AnsiballZ_command.py <<< 18911 1727096323.37126: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096323.290004-20872-169716208226403/AnsiballZ_command.py" <<< 18911 1727096323.37334: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18911d7od04qi/tmp_qn20gt2" to remote "/root/.ansible/tmp/ansible-tmp-1727096323.290004-20872-169716208226403/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096323.290004-20872-169716208226403/AnsiballZ_command.py" <<< 18911 1727096323.38570: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096323.38657: stderr chunk (state=3): >>><<< 18911 1727096323.38693: stdout chunk (state=3): >>><<< 18911 1727096323.38763: done transferring module to remote 18911 1727096323.38856: _low_level_execute_command(): starting 18911 1727096323.38865: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096323.290004-20872-169716208226403/ /root/.ansible/tmp/ansible-tmp-1727096323.290004-20872-169716208226403/AnsiballZ_command.py && sleep 0' 18911 1727096323.39636: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096323.39664: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096323.39748: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096323.39786: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096323.39811: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096323.39834: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096323.39962: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096323.41811: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096323.41838: stderr chunk (state=3): >>><<< 18911 1727096323.41842: stdout chunk (state=3): >>><<< 18911 1727096323.41858: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096323.41861: _low_level_execute_command(): starting 18911 1727096323.41870: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096323.290004-20872-169716208226403/AnsiballZ_command.py && sleep 0' 18911 1727096323.42428: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096323.42441: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096323.42475: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096323.42616: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 18911 1727096323.59660: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc", "start": "2024-09-23 08:58:43.578815", "end": "2024-09-23 08:58:43.595040", "delta": "0:00:00.016225", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 18911 1727096323.61187: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.11.125 closed. <<< 18911 1727096323.61213: stderr chunk (state=3): >>><<< 18911 1727096323.61218: stdout chunk (state=3): >>><<< 18911 1727096323.61236: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc", "start": "2024-09-23 08:58:43.578815", "end": "2024-09-23 08:58:43.595040", "delta": "0:00:00.016225", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.11.125 closed. 18911 1727096323.61265: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096323.290004-20872-169716208226403/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18911 1727096323.61279: _low_level_execute_command(): starting 18911 1727096323.61284: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096323.290004-20872-169716208226403/ > /dev/null 2>&1 && sleep 0' 18911 1727096323.61745: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096323.61748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 18911 1727096323.61751: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 18911 1727096323.61753: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096323.61755: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 18911 1727096323.61757: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096323.61817: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096323.61823: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096323.61825: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096323.61890: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096323.63765: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096323.63792: stderr chunk (state=3): >>><<< 18911 1727096323.63795: stdout chunk (state=3): >>><<< 18911 1727096323.63809: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096323.63815: handler run complete 18911 1727096323.63832: Evaluated conditional (False): False 18911 1727096323.63841: attempt loop complete, returning result 18911 1727096323.63845: _execute() done 18911 1727096323.63847: dumping result to json 18911 1727096323.63853: done dumping result, returning 18911 1727096323.63859: done running TaskExecutor() for managed_node1/TASK: Get NM profile info [0afff68d-5257-09a7-aae1-000000000505] 18911 1727096323.63864: sending task result for task 0afff68d-5257-09a7-aae1-000000000505 18911 1727096323.63962: done sending task result for task 0afff68d-5257-09a7-aae1-000000000505 18911 1727096323.63965: WORKER PROCESS EXITING fatal: [managed_node1]: FAILED! 
=> { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc", "delta": "0:00:00.016225", "end": "2024-09-23 08:58:43.595040", "rc": 1, "start": "2024-09-23 08:58:43.578815" } MSG: non-zero return code ...ignoring 18911 1727096323.64037: no more pending results, returning what we have 18911 1727096323.64041: results queue empty 18911 1727096323.64042: checking for any_errors_fatal 18911 1727096323.64048: done checking for any_errors_fatal 18911 1727096323.64049: checking for max_fail_percentage 18911 1727096323.64051: done checking for max_fail_percentage 18911 1727096323.64052: checking to see if all hosts have failed and the running result is not ok 18911 1727096323.64052: done checking to see if all hosts have failed 18911 1727096323.64053: getting the remaining hosts for this loop 18911 1727096323.64054: done getting the remaining hosts for this loop 18911 1727096323.64057: getting the next task for host managed_node1 18911 1727096323.64065: done getting next task for host managed_node1 18911 1727096323.64069: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 18911 1727096323.64072: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 18911 1727096323.64077: getting variables 18911 1727096323.64079: in VariableManager get_vars() 18911 1727096323.64108: Calling all_inventory to load vars for managed_node1 18911 1727096323.64111: Calling groups_inventory to load vars for managed_node1 18911 1727096323.64114: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096323.64125: Calling all_plugins_play to load vars for managed_node1 18911 1727096323.64128: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096323.64131: Calling groups_plugins_play to load vars for managed_node1 18911 1727096323.64957: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096323.65890: done with get_vars() 18911 1727096323.65911: done getting variables 18911 1727096323.65959: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Monday 23 September 2024 08:58:43 -0400 (0:00:00.462) 0:00:42.774 ****** 18911 1727096323.65987: entering _queue_task() for managed_node1/set_fact 18911 1727096323.66255: worker is 1 (out of 1 available) 18911 1727096323.66271: exiting _queue_task() for managed_node1/set_fact 18911 1727096323.66283: done queuing things up, now waiting for results queue to drain 18911 1727096323.66285: waiting for pending results... 
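The rc=1 that the play ignores above is grep's "nothing matched", not an nmcli error: a pipeline reports the exit status of its last command, so when no connection named lsr27 has a backing file under /etc, the final `grep /etc` exits 1 and the task records "non-zero return code". The test suite then treats that as "profile absent" rather than a failure (hence `...ignoring`, and the later `nm_profile_exists.rc == 0` check evaluating False). A minimal reproduction of the exit-status behaviour (the sample nmcli output is made up for illustration):

```shell
#!/bin/sh
# A pipeline's exit status is that of its last command, and grep exits 1
# when no line matches. The printf output stands in for
# `nmcli -f NAME,FILENAME connection show` on a host with no lsr27 profile.
rc=0
printf 'NAME   FILENAME\nens3   /run/NetworkManager/system-connections/ens3.nmconnection\n' \
  | grep lsr27 | grep /etc || rc=$?
echo "rc=$rc"   # rc=1: no matching profile, which the playbook ignores
```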
18911 1727096323.66457: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 18911 1727096323.66552: in run() - task 0afff68d-5257-09a7-aae1-000000000506 18911 1727096323.66566: variable 'ansible_search_path' from source: unknown 18911 1727096323.66572: variable 'ansible_search_path' from source: unknown 18911 1727096323.66599: calling self._execute() 18911 1727096323.66670: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096323.66674: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096323.66682: variable 'omit' from source: magic vars 18911 1727096323.66999: variable 'ansible_distribution_major_version' from source: facts 18911 1727096323.67010: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096323.67124: variable 'nm_profile_exists' from source: set_fact 18911 1727096323.67376: Evaluated conditional (nm_profile_exists.rc == 0): False 18911 1727096323.67380: when evaluation is False, skipping this task 18911 1727096323.67382: _execute() done 18911 1727096323.67385: dumping result to json 18911 1727096323.67388: done dumping result, returning 18911 1727096323.67390: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0afff68d-5257-09a7-aae1-000000000506] 18911 1727096323.67393: sending task result for task 0afff68d-5257-09a7-aae1-000000000506 18911 1727096323.67462: done sending task result for task 0afff68d-5257-09a7-aae1-000000000506 18911 1727096323.67465: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 18911 1727096323.67528: no more pending results, returning what we have 18911 1727096323.67531: results queue empty 18911 1727096323.67532: checking for any_errors_fatal 18911 
1727096323.67539: done checking for any_errors_fatal 18911 1727096323.67539: checking for max_fail_percentage 18911 1727096323.67541: done checking for max_fail_percentage 18911 1727096323.67542: checking to see if all hosts have failed and the running result is not ok 18911 1727096323.67542: done checking to see if all hosts have failed 18911 1727096323.67543: getting the remaining hosts for this loop 18911 1727096323.67544: done getting the remaining hosts for this loop 18911 1727096323.67547: getting the next task for host managed_node1 18911 1727096323.67555: done getting next task for host managed_node1 18911 1727096323.67557: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 18911 1727096323.67560: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096323.67563: getting variables 18911 1727096323.67567: in VariableManager get_vars() 18911 1727096323.67596: Calling all_inventory to load vars for managed_node1 18911 1727096323.67599: Calling groups_inventory to load vars for managed_node1 18911 1727096323.67602: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096323.67611: Calling all_plugins_play to load vars for managed_node1 18911 1727096323.67613: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096323.67615: Calling groups_plugins_play to load vars for managed_node1 18911 1727096323.69234: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096323.70324: done with get_vars() 18911 1727096323.70353: done getting variables 18911 1727096323.70492: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 18911 1727096323.70608: variable 'profile' from source: include params 18911 1727096323.70613: variable 'interface' from source: set_fact 18911 1727096323.70678: variable 'interface' from source: set_fact TASK [Get the ansible_managed comment in ifcfg-lsr27] ************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Monday 23 September 2024 08:58:43 -0400 (0:00:00.047) 0:00:42.821 ****** 18911 1727096323.70710: entering _queue_task() for managed_node1/command 18911 1727096323.71056: worker is 1 (out of 1 available) 18911 1727096323.71271: exiting _queue_task() for managed_node1/command 18911 1727096323.71281: done queuing things up, now waiting for results queue to drain 18911 1727096323.71283: waiting for pending results... 
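The "skipping" result above ("Conditional result was False") is Ansible's standard handling of a false `when:` condition. A minimal sketch of what the skipped task in get_profile_stat.yml likely looks like; the task name, flag variables, and condition are taken from the log, but the exact file contents are an assumption:

```yaml
# Hypothetical reconstruction -- the real task lives in the
# fedora.linux_system_roles test playbooks and may differ in detail.
- name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
  set_fact:
    lsr_net_profile_exists: true          # flag later checked by assert_profile_absent.yml
    lsr_net_profile_ansible_managed: true
  when: nm_profile_exists.rc == 0         # rc != 0 in this run, so the task is skipped
```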
18911 1727096323.71361: running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-lsr27 18911 1727096323.71512: in run() - task 0afff68d-5257-09a7-aae1-000000000508 18911 1727096323.71533: variable 'ansible_search_path' from source: unknown 18911 1727096323.71617: variable 'ansible_search_path' from source: unknown 18911 1727096323.71621: calling self._execute() 18911 1727096323.71679: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096323.71689: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096323.71704: variable 'omit' from source: magic vars 18911 1727096323.72081: variable 'ansible_distribution_major_version' from source: facts 18911 1727096323.72097: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096323.72248: variable 'profile_stat' from source: set_fact 18911 1727096323.72274: Evaluated conditional (profile_stat.stat.exists): False 18911 1727096323.72282: when evaluation is False, skipping this task 18911 1727096323.72290: _execute() done 18911 1727096323.72296: dumping result to json 18911 1727096323.72303: done dumping result, returning 18911 1727096323.72313: done running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-lsr27 [0afff68d-5257-09a7-aae1-000000000508] 18911 1727096323.72323: sending task result for task 0afff68d-5257-09a7-aae1-000000000508 18911 1727096323.72444: done sending task result for task 0afff68d-5257-09a7-aae1-000000000508 18911 1727096323.72447: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 18911 1727096323.72527: no more pending results, returning what we have 18911 1727096323.72531: results queue empty 18911 1727096323.72532: checking for any_errors_fatal 18911 1727096323.72538: done checking for any_errors_fatal 18911 1727096323.72539: checking 
for max_fail_percentage 18911 1727096323.72540: done checking for max_fail_percentage 18911 1727096323.72541: checking to see if all hosts have failed and the running result is not ok 18911 1727096323.72542: done checking to see if all hosts have failed 18911 1727096323.72543: getting the remaining hosts for this loop 18911 1727096323.72545: done getting the remaining hosts for this loop 18911 1727096323.72548: getting the next task for host managed_node1 18911 1727096323.72565: done getting next task for host managed_node1 18911 1727096323.72570: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 18911 1727096323.72574: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096323.72579: getting variables 18911 1727096323.72580: in VariableManager get_vars() 18911 1727096323.72790: Calling all_inventory to load vars for managed_node1 18911 1727096323.72793: Calling groups_inventory to load vars for managed_node1 18911 1727096323.72796: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096323.72808: Calling all_plugins_play to load vars for managed_node1 18911 1727096323.72810: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096323.72813: Calling groups_plugins_play to load vars for managed_node1 18911 1727096323.74340: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096323.76160: done with get_vars() 18911 1727096323.76188: done getting variables 18911 1727096323.76256: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 18911 1727096323.76382: variable 'profile' from source: include params 18911 1727096323.76386: variable 'interface' from source: set_fact 18911 1727096323.76445: variable 'interface' from source: set_fact TASK [Verify the ansible_managed comment in ifcfg-lsr27] *********************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Monday 23 September 2024 08:58:43 -0400 (0:00:00.057) 0:00:42.879 ****** 18911 1727096323.76479: entering _queue_task() for managed_node1/set_fact 18911 1727096323.76836: worker is 1 (out of 1 available) 18911 1727096323.76849: exiting _queue_task() for managed_node1/set_fact 18911 1727096323.76860: done queuing things up, now waiting for results queue to drain 18911 1727096323.76862: waiting for pending results... 
18911 1727096323.77285: running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-lsr27 18911 1727096323.77289: in run() - task 0afff68d-5257-09a7-aae1-000000000509 18911 1727096323.77292: variable 'ansible_search_path' from source: unknown 18911 1727096323.77294: variable 'ansible_search_path' from source: unknown 18911 1727096323.77320: calling self._execute() 18911 1727096323.77413: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096323.77424: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096323.77438: variable 'omit' from source: magic vars 18911 1727096323.78075: variable 'ansible_distribution_major_version' from source: facts 18911 1727096323.78079: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096323.78239: variable 'profile_stat' from source: set_fact 18911 1727096323.78311: Evaluated conditional (profile_stat.stat.exists): False 18911 1727096323.78379: when evaluation is False, skipping this task 18911 1727096323.78388: _execute() done 18911 1727096323.78404: dumping result to json 18911 1727096323.78413: done dumping result, returning 18911 1727096323.78426: done running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-lsr27 [0afff68d-5257-09a7-aae1-000000000509] 18911 1727096323.78438: sending task result for task 0afff68d-5257-09a7-aae1-000000000509 18911 1727096323.78873: done sending task result for task 0afff68d-5257-09a7-aae1-000000000509 18911 1727096323.78876: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 18911 1727096323.78924: no more pending results, returning what we have 18911 1727096323.78927: results queue empty 18911 1727096323.78928: checking for any_errors_fatal 18911 1727096323.78935: done checking for any_errors_fatal 18911 1727096323.78936: 
checking for max_fail_percentage 18911 1727096323.78938: done checking for max_fail_percentage 18911 1727096323.78939: checking to see if all hosts have failed and the running result is not ok 18911 1727096323.78939: done checking to see if all hosts have failed 18911 1727096323.78940: getting the remaining hosts for this loop 18911 1727096323.78941: done getting the remaining hosts for this loop 18911 1727096323.78945: getting the next task for host managed_node1 18911 1727096323.78953: done getting next task for host managed_node1 18911 1727096323.78956: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 18911 1727096323.78960: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096323.78965: getting variables 18911 1727096323.78967: in VariableManager get_vars() 18911 1727096323.78999: Calling all_inventory to load vars for managed_node1 18911 1727096323.79002: Calling groups_inventory to load vars for managed_node1 18911 1727096323.79006: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096323.79019: Calling all_plugins_play to load vars for managed_node1 18911 1727096323.79022: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096323.79025: Calling groups_plugins_play to load vars for managed_node1 18911 1727096323.82113: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096323.85180: done with get_vars() 18911 1727096323.85213: done getting variables 18911 1727096323.85576: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 18911 1727096323.85696: variable 'profile' from source: include params 18911 1727096323.85700: variable 'interface' from source: set_fact 18911 1727096323.85756: variable 'interface' from source: set_fact TASK [Get the fingerprint comment in ifcfg-lsr27] ****************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Monday 23 September 2024 08:58:43 -0400 (0:00:00.093) 0:00:42.972 ****** 18911 1727096323.85792: entering _queue_task() for managed_node1/command 18911 1727096323.86146: worker is 1 (out of 1 available) 18911 1727096323.86159: exiting _queue_task() for managed_node1/command 18911 1727096323.86374: done queuing things up, now waiting for results queue to drain 18911 1727096323.86377: waiting for pending results... 
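Each of the ifcfg-lsr27 tasks above is gated on `profile_stat.stat.exists` and skipped because the profile file is absent on the managed node. A hedged sketch of the pattern; only the condition and task names appear in the log, while the stat task, file path, and grep pattern are illustrative assumptions:

```yaml
# Illustrative only: stat the profile file once, then gate follow-up
# inspection tasks on its existence.
- name: Get stat of the profile file
  stat:
    path: /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
  register: profile_stat

- name: Get the ansible_managed comment in ifcfg-{{ profile }}
  command: grep ansible_managed /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
  when: profile_stat.stat.exists          # False in this run, so the task is skipped
```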
18911 1727096323.86447: running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-lsr27 18911 1727096323.86591: in run() - task 0afff68d-5257-09a7-aae1-00000000050a 18911 1727096323.86621: variable 'ansible_search_path' from source: unknown 18911 1727096323.86631: variable 'ansible_search_path' from source: unknown 18911 1727096323.86677: calling self._execute() 18911 1727096323.86778: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096323.86790: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096323.86804: variable 'omit' from source: magic vars 18911 1727096323.87164: variable 'ansible_distribution_major_version' from source: facts 18911 1727096323.87185: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096323.87365: variable 'profile_stat' from source: set_fact 18911 1727096323.87371: Evaluated conditional (profile_stat.stat.exists): False 18911 1727096323.87374: when evaluation is False, skipping this task 18911 1727096323.87376: _execute() done 18911 1727096323.87378: dumping result to json 18911 1727096323.87381: done dumping result, returning 18911 1727096323.87383: done running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-lsr27 [0afff68d-5257-09a7-aae1-00000000050a] 18911 1727096323.87385: sending task result for task 0afff68d-5257-09a7-aae1-00000000050a skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 18911 1727096323.87633: no more pending results, returning what we have 18911 1727096323.87637: results queue empty 18911 1727096323.87639: checking for any_errors_fatal 18911 1727096323.87645: done checking for any_errors_fatal 18911 1727096323.87646: checking for max_fail_percentage 18911 1727096323.87647: done checking for max_fail_percentage 18911 1727096323.87648: checking to see if all hosts have failed 
and the running result is not ok 18911 1727096323.87649: done checking to see if all hosts have failed 18911 1727096323.87650: getting the remaining hosts for this loop 18911 1727096323.87651: done getting the remaining hosts for this loop 18911 1727096323.87655: getting the next task for host managed_node1 18911 1727096323.87662: done getting next task for host managed_node1 18911 1727096323.87665: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 18911 1727096323.87671: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096323.87675: getting variables 18911 1727096323.87677: in VariableManager get_vars() 18911 1727096323.87707: Calling all_inventory to load vars for managed_node1 18911 1727096323.87709: Calling groups_inventory to load vars for managed_node1 18911 1727096323.87713: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096323.87726: Calling all_plugins_play to load vars for managed_node1 18911 1727096323.87729: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096323.87732: Calling groups_plugins_play to load vars for managed_node1 18911 1727096323.88474: done sending task result for task 0afff68d-5257-09a7-aae1-00000000050a 18911 1727096323.88478: WORKER PROCESS EXITING 18911 1727096323.90861: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096323.94396: done with get_vars() 18911 1727096323.94448: done getting variables 18911 1727096323.94524: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 18911 1727096323.94636: variable 'profile' from source: include params 18911 1727096323.94641: variable 'interface' from source: set_fact 18911 1727096323.94699: variable 'interface' from source: set_fact TASK [Verify the fingerprint comment in ifcfg-lsr27] *************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Monday 23 September 2024 08:58:43 -0400 (0:00:00.089) 0:00:43.061 ****** 18911 1727096323.94744: entering _queue_task() for managed_node1/set_fact 18911 1727096323.95154: worker is 1 (out of 1 available) 18911 1727096323.95166: exiting _queue_task() for managed_node1/set_fact 18911 
1727096323.95378: done queuing things up, now waiting for results queue to drain 18911 1727096323.95380: waiting for pending results... 18911 1727096323.95475: running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-lsr27 18911 1727096323.95634: in run() - task 0afff68d-5257-09a7-aae1-00000000050b 18911 1727096323.95656: variable 'ansible_search_path' from source: unknown 18911 1727096323.95665: variable 'ansible_search_path' from source: unknown 18911 1727096323.95781: calling self._execute() 18911 1727096323.95885: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096323.95899: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096323.96000: variable 'omit' from source: magic vars 18911 1727096323.96286: variable 'ansible_distribution_major_version' from source: facts 18911 1727096323.96306: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096323.96440: variable 'profile_stat' from source: set_fact 18911 1727096323.96461: Evaluated conditional (profile_stat.stat.exists): False 18911 1727096323.96470: when evaluation is False, skipping this task 18911 1727096323.96477: _execute() done 18911 1727096323.96483: dumping result to json 18911 1727096323.96488: done dumping result, returning 18911 1727096323.96498: done running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-lsr27 [0afff68d-5257-09a7-aae1-00000000050b] 18911 1727096323.96505: sending task result for task 0afff68d-5257-09a7-aae1-00000000050b 18911 1727096323.96775: done sending task result for task 0afff68d-5257-09a7-aae1-00000000050b 18911 1727096323.96779: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 18911 1727096323.96825: no more pending results, returning what we have 18911 1727096323.96828: results queue empty 18911 
1727096323.96830: checking for any_errors_fatal 18911 1727096323.96834: done checking for any_errors_fatal 18911 1727096323.96835: checking for max_fail_percentage 18911 1727096323.96837: done checking for max_fail_percentage 18911 1727096323.96838: checking to see if all hosts have failed and the running result is not ok 18911 1727096323.96839: done checking to see if all hosts have failed 18911 1727096323.96840: getting the remaining hosts for this loop 18911 1727096323.96841: done getting the remaining hosts for this loop 18911 1727096323.96845: getting the next task for host managed_node1 18911 1727096323.96855: done getting next task for host managed_node1 18911 1727096323.96858: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 18911 1727096323.96862: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096323.96868: getting variables 18911 1727096323.96872: in VariableManager get_vars() 18911 1727096323.96904: Calling all_inventory to load vars for managed_node1 18911 1727096323.96907: Calling groups_inventory to load vars for managed_node1 18911 1727096323.96910: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096323.96922: Calling all_plugins_play to load vars for managed_node1 18911 1727096323.96925: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096323.96928: Calling groups_plugins_play to load vars for managed_node1 18911 1727096323.98808: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096324.01374: done with get_vars() 18911 1727096324.01407: done getting variables 18911 1727096324.01482: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 18911 1727096324.01608: variable 'profile' from source: include params 18911 1727096324.01612: variable 'interface' from source: set_fact 18911 1727096324.01679: variable 'interface' from source: set_fact TASK [Assert that the profile is absent - 'lsr27'] ***************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Monday 23 September 2024 08:58:44 -0400 (0:00:00.069) 0:00:43.131 ****** 18911 1727096324.01711: entering _queue_task() for managed_node1/assert 18911 1727096324.02211: worker is 1 (out of 1 available) 18911 1727096324.02222: exiting _queue_task() for managed_node1/assert 18911 1727096324.02232: done queuing things up, now waiting for results queue to drain 18911 1727096324.02234: waiting for pending results... 
18911 1727096324.02426: running TaskExecutor() for managed_node1/TASK: Assert that the profile is absent - 'lsr27' 18911 1727096324.02654: in run() - task 0afff68d-5257-09a7-aae1-0000000004f6 18911 1727096324.02658: variable 'ansible_search_path' from source: unknown 18911 1727096324.02661: variable 'ansible_search_path' from source: unknown 18911 1727096324.02666: calling self._execute() 18911 1727096324.02743: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096324.02754: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096324.02783: variable 'omit' from source: magic vars 18911 1727096324.03773: variable 'ansible_distribution_major_version' from source: facts 18911 1727096324.03777: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096324.03780: variable 'omit' from source: magic vars 18911 1727096324.03783: variable 'omit' from source: magic vars 18911 1727096324.03851: variable 'profile' from source: include params 18911 1727096324.03877: variable 'interface' from source: set_fact 18911 1727096324.03990: variable 'interface' from source: set_fact 18911 1727096324.04020: variable 'omit' from source: magic vars 18911 1727096324.04063: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18911 1727096324.04106: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18911 1727096324.04343: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18911 1727096324.04346: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096324.04349: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096324.04351: variable 'inventory_hostname' from source: host vars for 
'managed_node1' 18911 1727096324.04354: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096324.04356: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096324.04540: Set connection var ansible_shell_executable to /bin/sh 18911 1727096324.04577: Set connection var ansible_timeout to 10 18911 1727096324.04655: Set connection var ansible_shell_type to sh 18911 1727096324.04679: Set connection var ansible_module_compression to ZIP_DEFLATED 18911 1727096324.04690: Set connection var ansible_pipelining to False 18911 1727096324.04700: Set connection var ansible_connection to ssh 18911 1727096324.04727: variable 'ansible_shell_executable' from source: unknown 18911 1727096324.04736: variable 'ansible_connection' from source: unknown 18911 1727096324.04766: variable 'ansible_module_compression' from source: unknown 18911 1727096324.04779: variable 'ansible_shell_type' from source: unknown 18911 1727096324.04972: variable 'ansible_shell_executable' from source: unknown 18911 1727096324.04976: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096324.04978: variable 'ansible_pipelining' from source: unknown 18911 1727096324.04981: variable 'ansible_timeout' from source: unknown 18911 1727096324.04984: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096324.05171: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18911 1727096324.05190: variable 'omit' from source: magic vars 18911 1727096324.05200: starting attempt loop 18911 1727096324.05338: running the handler 18911 1727096324.05573: variable 'lsr_net_profile_exists' from source: set_fact 18911 1727096324.05576: Evaluated conditional (not 
lsr_net_profile_exists): True 18911 1727096324.05579: handler run complete 18911 1727096324.05642: attempt loop complete, returning result 18911 1727096324.05680: _execute() done 18911 1727096324.05719: dumping result to json 18911 1727096324.05728: done dumping result, returning 18911 1727096324.05754: done running TaskExecutor() for managed_node1/TASK: Assert that the profile is absent - 'lsr27' [0afff68d-5257-09a7-aae1-0000000004f6] 18911 1727096324.05757: sending task result for task 0afff68d-5257-09a7-aae1-0000000004f6 18911 1727096324.05932: done sending task result for task 0afff68d-5257-09a7-aae1-0000000004f6 18911 1727096324.05936: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 18911 1727096324.06019: no more pending results, returning what we have 18911 1727096324.06024: results queue empty 18911 1727096324.06025: checking for any_errors_fatal 18911 1727096324.06032: done checking for any_errors_fatal 18911 1727096324.06033: checking for max_fail_percentage 18911 1727096324.06035: done checking for max_fail_percentage 18911 1727096324.06036: checking to see if all hosts have failed and the running result is not ok 18911 1727096324.06037: done checking to see if all hosts have failed 18911 1727096324.06038: getting the remaining hosts for this loop 18911 1727096324.06039: done getting the remaining hosts for this loop 18911 1727096324.06043: getting the next task for host managed_node1 18911 1727096324.06053: done getting next task for host managed_node1 18911 1727096324.06057: ^ task is: TASK: Include the task 'assert_device_absent.yml' 18911 1727096324.06059: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096324.06068: getting variables 18911 1727096324.06071: in VariableManager get_vars() 18911 1727096324.06103: Calling all_inventory to load vars for managed_node1 18911 1727096324.06107: Calling groups_inventory to load vars for managed_node1 18911 1727096324.06111: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096324.06123: Calling all_plugins_play to load vars for managed_node1 18911 1727096324.06126: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096324.06129: Calling groups_plugins_play to load vars for managed_node1 18911 1727096324.07705: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096324.09565: done with get_vars() 18911 1727096324.09595: done getting variables TASK [Include the task 'assert_device_absent.yml'] ***************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:75 Monday 23 September 2024 08:58:44 -0400 (0:00:00.079) 0:00:43.211 ****** 18911 1727096324.09686: entering _queue_task() for managed_node1/include_tasks 18911 1727096324.10077: worker is 1 (out of 1 available) 18911 1727096324.10089: exiting _queue_task() for managed_node1/include_tasks 18911 1727096324.10100: done queuing things up, now waiting for results queue to drain 18911 1727096324.10102: waiting for pending results... 
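The assertion at assert_profile_absent.yml:5 passes ("All assertions passed") because `lsr_net_profile_exists` was never set true earlier in the run. A minimal sketch consistent with the logged condition `not lsr_net_profile_exists`; the failure message is an illustrative assumption:

```yaml
# Sketch of the passing assertion; the evaluated condition is taken
# verbatim from the log, the fail_msg is illustrative.
- name: Assert that the profile is absent - '{{ profile }}'
  assert:
    that:
      - not lsr_net_profile_exists
    fail_msg: "Profile {{ profile }} still exists"
```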
18911 1727096324.10286: running TaskExecutor() for managed_node1/TASK: Include the task 'assert_device_absent.yml' 18911 1727096324.10356: in run() - task 0afff68d-5257-09a7-aae1-000000000075 18911 1727096324.10368: variable 'ansible_search_path' from source: unknown 18911 1727096324.10399: calling self._execute() 18911 1727096324.10476: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096324.10480: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096324.10489: variable 'omit' from source: magic vars 18911 1727096324.10778: variable 'ansible_distribution_major_version' from source: facts 18911 1727096324.10788: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096324.10794: _execute() done 18911 1727096324.10798: dumping result to json 18911 1727096324.10801: done dumping result, returning 18911 1727096324.10806: done running TaskExecutor() for managed_node1/TASK: Include the task 'assert_device_absent.yml' [0afff68d-5257-09a7-aae1-000000000075] 18911 1727096324.10811: sending task result for task 0afff68d-5257-09a7-aae1-000000000075 18911 1727096324.10900: done sending task result for task 0afff68d-5257-09a7-aae1-000000000075 18911 1727096324.10903: WORKER PROCESS EXITING 18911 1727096324.10929: no more pending results, returning what we have 18911 1727096324.10934: in VariableManager get_vars() 18911 1727096324.10966: Calling all_inventory to load vars for managed_node1 18911 1727096324.10971: Calling groups_inventory to load vars for managed_node1 18911 1727096324.10975: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096324.10988: Calling all_plugins_play to load vars for managed_node1 18911 1727096324.10991: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096324.10993: Calling groups_plugins_play to load vars for managed_node1 18911 1727096324.12677: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096324.13966: done with get_vars() 18911 1727096324.13986: variable 'ansible_search_path' from source: unknown 18911 1727096324.13999: we have included files to process 18911 1727096324.14000: generating all_blocks data 18911 1727096324.14001: done generating all_blocks data 18911 1727096324.14005: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 18911 1727096324.14006: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 18911 1727096324.14007: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 18911 1727096324.14124: in VariableManager get_vars() 18911 1727096324.14135: done with get_vars() 18911 1727096324.14217: done processing included file 18911 1727096324.14219: iterating over new_blocks loaded from include file 18911 1727096324.14220: in VariableManager get_vars() 18911 1727096324.14227: done with get_vars() 18911 1727096324.14228: filtering new block on tags 18911 1727096324.14239: done filtering new block on tags 18911 1727096324.14241: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node1 18911 1727096324.14245: extending task lists for all hosts with included blocks 18911 1727096324.14333: done extending task lists 18911 1727096324.14334: done processing included files 18911 1727096324.14334: results queue empty 18911 1727096324.14335: checking for any_errors_fatal 18911 1727096324.14338: done checking for any_errors_fatal 18911 1727096324.14338: checking for max_fail_percentage 18911 1727096324.14339: done 
checking for max_fail_percentage 18911 1727096324.14339: checking to see if all hosts have failed and the running result is not ok 18911 1727096324.14340: done checking to see if all hosts have failed 18911 1727096324.14340: getting the remaining hosts for this loop 18911 1727096324.14341: done getting the remaining hosts for this loop 18911 1727096324.14342: getting the next task for host managed_node1 18911 1727096324.14345: done getting next task for host managed_node1 18911 1727096324.14346: ^ task is: TASK: Include the task 'get_interface_stat.yml' 18911 1727096324.14348: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096324.14349: getting variables 18911 1727096324.14350: in VariableManager get_vars() 18911 1727096324.14355: Calling all_inventory to load vars for managed_node1 18911 1727096324.14357: Calling groups_inventory to load vars for managed_node1 18911 1727096324.14358: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096324.14362: Calling all_plugins_play to load vars for managed_node1 18911 1727096324.14366: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096324.14370: Calling groups_plugins_play to load vars for managed_node1 18911 1727096324.15152: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096324.16622: done with get_vars() 18911 1727096324.16649: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Monday 23 September 2024 08:58:44 -0400 (0:00:00.070) 0:00:43.281 ****** 18911 1727096324.16731: entering _queue_task() for managed_node1/include_tasks 18911 1727096324.17102: worker is 1 (out of 1 available) 18911 1727096324.17113: exiting _queue_task() for managed_node1/include_tasks 18911 1727096324.17128: done queuing things up, now waiting for results queue to drain 18911 1727096324.17131: waiting for pending results... 
18911 1727096324.17376: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 18911 1727096324.17453: in run() - task 0afff68d-5257-09a7-aae1-00000000053c 18911 1727096324.17464: variable 'ansible_search_path' from source: unknown 18911 1727096324.17469: variable 'ansible_search_path' from source: unknown 18911 1727096324.17500: calling self._execute() 18911 1727096324.17573: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096324.17579: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096324.17588: variable 'omit' from source: magic vars 18911 1727096324.17866: variable 'ansible_distribution_major_version' from source: facts 18911 1727096324.17880: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096324.17885: _execute() done 18911 1727096324.17889: dumping result to json 18911 1727096324.17892: done dumping result, returning 18911 1727096324.17897: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [0afff68d-5257-09a7-aae1-00000000053c] 18911 1727096324.17902: sending task result for task 0afff68d-5257-09a7-aae1-00000000053c 18911 1727096324.17996: done sending task result for task 0afff68d-5257-09a7-aae1-00000000053c 18911 1727096324.17999: WORKER PROCESS EXITING 18911 1727096324.18037: no more pending results, returning what we have 18911 1727096324.18042: in VariableManager get_vars() 18911 1727096324.18083: Calling all_inventory to load vars for managed_node1 18911 1727096324.18087: Calling groups_inventory to load vars for managed_node1 18911 1727096324.18090: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096324.18103: Calling all_plugins_play to load vars for managed_node1 18911 1727096324.18106: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096324.18108: Calling groups_plugins_play to load vars for managed_node1 18911 
1727096324.19076: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096324.20398: done with get_vars() 18911 1727096324.20416: variable 'ansible_search_path' from source: unknown 18911 1727096324.20417: variable 'ansible_search_path' from source: unknown 18911 1727096324.20444: we have included files to process 18911 1727096324.20445: generating all_blocks data 18911 1727096324.20446: done generating all_blocks data 18911 1727096324.20447: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 18911 1727096324.20448: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 18911 1727096324.20449: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 18911 1727096324.20581: done processing included file 18911 1727096324.20582: iterating over new_blocks loaded from include file 18911 1727096324.20584: in VariableManager get_vars() 18911 1727096324.20595: done with get_vars() 18911 1727096324.20596: filtering new block on tags 18911 1727096324.20606: done filtering new block on tags 18911 1727096324.20608: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node1 18911 1727096324.20612: extending task lists for all hosts with included blocks 18911 1727096324.20672: done extending task lists 18911 1727096324.20673: done processing included files 18911 1727096324.20674: results queue empty 18911 1727096324.20674: checking for any_errors_fatal 18911 1727096324.20676: done checking for any_errors_fatal 18911 1727096324.20676: checking for max_fail_percentage 18911 1727096324.20677: done checking for 
max_fail_percentage 18911 1727096324.20678: checking to see if all hosts have failed and the running result is not ok 18911 1727096324.20678: done checking to see if all hosts have failed 18911 1727096324.20679: getting the remaining hosts for this loop 18911 1727096324.20679: done getting the remaining hosts for this loop 18911 1727096324.20681: getting the next task for host managed_node1 18911 1727096324.20684: done getting next task for host managed_node1 18911 1727096324.20685: ^ task is: TASK: Get stat for interface {{ interface }} 18911 1727096324.20687: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096324.20688: getting variables 18911 1727096324.20689: in VariableManager get_vars() 18911 1727096324.20696: Calling all_inventory to load vars for managed_node1 18911 1727096324.20698: Calling groups_inventory to load vars for managed_node1 18911 1727096324.20700: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096324.20705: Calling all_plugins_play to load vars for managed_node1 18911 1727096324.20706: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096324.20708: Calling groups_plugins_play to load vars for managed_node1 18911 1727096324.21375: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096324.22233: done with get_vars() 18911 1727096324.22252: done getting variables 18911 1727096324.22372: variable 'interface' from source: set_fact TASK [Get stat for interface lsr27] ******************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Monday 23 September 2024 08:58:44 -0400 (0:00:00.056) 0:00:43.338 ****** 18911 1727096324.22395: entering _queue_task() for managed_node1/stat 18911 1727096324.22663: worker is 1 (out of 1 available) 18911 1727096324.22678: exiting _queue_task() for managed_node1/stat 18911 1727096324.22688: done queuing things up, now waiting for results queue to drain 18911 1727096324.22690: waiting for pending results... 
18911 1727096324.22864: running TaskExecutor() for managed_node1/TASK: Get stat for interface lsr27 18911 1727096324.22942: in run() - task 0afff68d-5257-09a7-aae1-000000000554 18911 1727096324.22955: variable 'ansible_search_path' from source: unknown 18911 1727096324.22958: variable 'ansible_search_path' from source: unknown 18911 1727096324.22994: calling self._execute() 18911 1727096324.23063: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096324.23072: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096324.23081: variable 'omit' from source: magic vars 18911 1727096324.23355: variable 'ansible_distribution_major_version' from source: facts 18911 1727096324.23359: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096324.23373: variable 'omit' from source: magic vars 18911 1727096324.23402: variable 'omit' from source: magic vars 18911 1727096324.23475: variable 'interface' from source: set_fact 18911 1727096324.23489: variable 'omit' from source: magic vars 18911 1727096324.23522: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18911 1727096324.23548: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18911 1727096324.23573: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18911 1727096324.23587: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096324.23596: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096324.23619: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18911 1727096324.23623: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096324.23625: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096324.23702: Set connection var ansible_shell_executable to /bin/sh 18911 1727096324.23705: Set connection var ansible_timeout to 10 18911 1727096324.23708: Set connection var ansible_shell_type to sh 18911 1727096324.23715: Set connection var ansible_module_compression to ZIP_DEFLATED 18911 1727096324.23720: Set connection var ansible_pipelining to False 18911 1727096324.23725: Set connection var ansible_connection to ssh 18911 1727096324.23742: variable 'ansible_shell_executable' from source: unknown 18911 1727096324.23745: variable 'ansible_connection' from source: unknown 18911 1727096324.23747: variable 'ansible_module_compression' from source: unknown 18911 1727096324.23750: variable 'ansible_shell_type' from source: unknown 18911 1727096324.23752: variable 'ansible_shell_executable' from source: unknown 18911 1727096324.23754: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096324.23758: variable 'ansible_pipelining' from source: unknown 18911 1727096324.23760: variable 'ansible_timeout' from source: unknown 18911 1727096324.23765: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096324.23916: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18911 1727096324.23925: variable 'omit' from source: magic vars 18911 1727096324.23930: starting attempt loop 18911 1727096324.23933: running the handler 18911 1727096324.23944: _low_level_execute_command(): starting 18911 1727096324.23951: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18911 1727096324.24463: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096324.24467: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096324.24472: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096324.24484: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096324.24541: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096324.24545: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096324.24547: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096324.24630: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096324.26339: stdout chunk (state=3): >>>/root <<< 18911 1727096324.26440: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096324.26472: stderr chunk (state=3): >>><<< 18911 1727096324.26476: stdout chunk (state=3): >>><<< 18911 1727096324.26500: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096324.26513: _low_level_execute_command(): starting 18911 1727096324.26520: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096324.2650166-20929-199300556784224 `" && echo ansible-tmp-1727096324.2650166-20929-199300556784224="` echo /root/.ansible/tmp/ansible-tmp-1727096324.2650166-20929-199300556784224 `" ) && sleep 0' 18911 1727096324.26973: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096324.26976: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 18911 1727096324.26978: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096324.26988: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096324.26991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096324.27036: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096324.27042: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096324.27045: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096324.27109: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096324.29027: stdout chunk (state=3): >>>ansible-tmp-1727096324.2650166-20929-199300556784224=/root/.ansible/tmp/ansible-tmp-1727096324.2650166-20929-199300556784224 <<< 18911 1727096324.29140: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096324.29183: stderr chunk (state=3): >>><<< 18911 1727096324.29186: stdout chunk (state=3): >>><<< 18911 1727096324.29210: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096324.2650166-20929-199300556784224=/root/.ansible/tmp/ansible-tmp-1727096324.2650166-20929-199300556784224 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096324.29249: variable 'ansible_module_compression' from source: unknown 18911 1727096324.29295: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18911d7od04qi/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 18911 1727096324.29329: variable 'ansible_facts' from source: unknown 18911 1727096324.29383: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096324.2650166-20929-199300556784224/AnsiballZ_stat.py 18911 1727096324.29491: Sending initial data 18911 1727096324.29495: Sent initial data (153 bytes) 18911 1727096324.29942: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096324.29946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 18911 1727096324.29948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096324.29951: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096324.29956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096324.30011: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096324.30019: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096324.30020: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096324.30087: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096324.31671: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18911 1727096324.31728: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18911 1727096324.31795: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18911d7od04qi/tmps10ckhj7 /root/.ansible/tmp/ansible-tmp-1727096324.2650166-20929-199300556784224/AnsiballZ_stat.py <<< 18911 1727096324.31798: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096324.2650166-20929-199300556784224/AnsiballZ_stat.py" <<< 18911 1727096324.31861: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18911d7od04qi/tmps10ckhj7" to remote "/root/.ansible/tmp/ansible-tmp-1727096324.2650166-20929-199300556784224/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096324.2650166-20929-199300556784224/AnsiballZ_stat.py" <<< 18911 1727096324.32457: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096324.32500: stderr chunk (state=3): >>><<< 18911 1727096324.32504: stdout chunk (state=3): >>><<< 18911 1727096324.32527: done transferring module to remote 18911 1727096324.32536: _low_level_execute_command(): starting 18911 1727096324.32540: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096324.2650166-20929-199300556784224/ /root/.ansible/tmp/ansible-tmp-1727096324.2650166-20929-199300556784224/AnsiballZ_stat.py && sleep 0' 18911 1727096324.32974: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096324.32978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096324.32980: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096324.32986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 18911 1727096324.32988: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096324.33032: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096324.33035: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096324.33106: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096324.34948: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096324.34952: stdout chunk (state=3): >>><<< 18911 1727096324.34954: stderr chunk (state=3): >>><<< 18911 1727096324.34971: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096324.35052: _low_level_execute_command(): starting 18911 1727096324.35055: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096324.2650166-20929-199300556784224/AnsiballZ_stat.py && sleep 0' 18911 1727096324.35639: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096324.35679: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096324.35782: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
18911 1727096324.51115: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr27", "follow": false, "checksum_algorithm": "sha1"}}} <<< 18911 1727096324.52586: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. <<< 18911 1727096324.52590: stdout chunk (state=3): >>><<< 18911 1727096324.52593: stderr chunk (state=3): >>><<< 18911 1727096324.52612: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr27", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 Shared connection to 10.31.11.125 closed. 18911 1727096324.52646: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/lsr27', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096324.2650166-20929-199300556784224/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18911 1727096324.52732: _low_level_execute_command(): starting 18911 1727096324.52736: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096324.2650166-20929-199300556784224/ > /dev/null 2>&1 && sleep 0' 18911 1727096324.53263: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096324.53281: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096324.53296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096324.53315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096324.53332: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 18911 1727096324.53345: stderr chunk (state=3): >>>debug2: match not found <<< 18911 1727096324.53360: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096324.53387: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass <<< 18911 1727096324.53401: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.125 is address <<< 18911 1727096324.53413: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 18911 1727096324.53425: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096324.53444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096324.53461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096324.53534: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096324.53570: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096324.53861: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096324.55800: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096324.55813: stdout chunk (state=3): >>><<< 18911 1727096324.55826: stderr chunk (state=3): >>><<< 18911 1727096324.55851: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096324.55863: handler run complete 18911 1727096324.55893: attempt loop complete, returning result 18911 1727096324.55901: _execute() done 18911 1727096324.55909: dumping result to json 18911 1727096324.55917: done dumping result, returning 18911 1727096324.55930: done running TaskExecutor() for managed_node1/TASK: Get stat for interface lsr27 [0afff68d-5257-09a7-aae1-000000000554] 18911 1727096324.55940: sending task result for task 0afff68d-5257-09a7-aae1-000000000554 ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 18911 1727096324.56135: no more pending results, returning what we have 18911 1727096324.56139: results queue empty 18911 1727096324.56140: checking for any_errors_fatal 18911 1727096324.56141: done checking for any_errors_fatal 18911 1727096324.56142: checking for max_fail_percentage 18911 1727096324.56144: done checking for max_fail_percentage 18911 1727096324.56145: checking to see if all hosts have failed and the running result is not ok 18911 1727096324.56145: done checking to see if all hosts have failed 18911 1727096324.56146: getting the remaining hosts for this loop 18911 1727096324.56147: done getting the remaining hosts for this loop 18911 1727096324.56151: getting the next task for host managed_node1 18911 1727096324.56160: done getting next task for host managed_node1 18911 1727096324.56163: ^ task is: TASK: 
Assert that the interface is absent - '{{ interface }}' 18911 1727096324.56169: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18911 1727096324.56174: getting variables 18911 1727096324.56176: in VariableManager get_vars() 18911 1727096324.56300: Calling all_inventory to load vars for managed_node1 18911 1727096324.56304: Calling groups_inventory to load vars for managed_node1 18911 1727096324.56307: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096324.56318: Calling all_plugins_play to load vars for managed_node1 18911 1727096324.56321: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096324.56324: Calling groups_plugins_play to load vars for managed_node1 18911 1727096324.56843: done sending task result for task 0afff68d-5257-09a7-aae1-000000000554 18911 1727096324.56847: WORKER PROCESS EXITING 18911 1727096324.57801: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096324.59398: done with get_vars() 18911 1727096324.59427: done getting variables 18911 1727096324.59493: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 18911 1727096324.59611: variable 'interface' from source: 
set_fact TASK [Assert that the interface is absent - 'lsr27'] *************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Monday 23 September 2024 08:58:44 -0400 (0:00:00.372) 0:00:43.710 ****** 18911 1727096324.59642: entering _queue_task() for managed_node1/assert 18911 1727096324.60007: worker is 1 (out of 1 available) 18911 1727096324.60020: exiting _queue_task() for managed_node1/assert 18911 1727096324.60031: done queuing things up, now waiting for results queue to drain 18911 1727096324.60033: waiting for pending results... 18911 1727096324.60322: running TaskExecutor() for managed_node1/TASK: Assert that the interface is absent - 'lsr27' 18911 1727096324.60435: in run() - task 0afff68d-5257-09a7-aae1-00000000053d 18911 1727096324.60457: variable 'ansible_search_path' from source: unknown 18911 1727096324.60465: variable 'ansible_search_path' from source: unknown 18911 1727096324.60513: calling self._execute() 18911 1727096324.60609: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096324.60620: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096324.60635: variable 'omit' from source: magic vars 18911 1727096324.61000: variable 'ansible_distribution_major_version' from source: facts 18911 1727096324.61019: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096324.61032: variable 'omit' from source: magic vars 18911 1727096324.61085: variable 'omit' from source: magic vars 18911 1727096324.61199: variable 'interface' from source: set_fact 18911 1727096324.61226: variable 'omit' from source: magic vars 18911 1727096324.61283: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18911 1727096324.61352: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 
18911 1727096324.61386: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18911 1727096324.61409: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096324.61425: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096324.61462: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18911 1727096324.61476: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096324.61510: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096324.61626: Set connection var ansible_shell_executable to /bin/sh 18911 1727096324.61638: Set connection var ansible_timeout to 10 18911 1727096324.61671: Set connection var ansible_shell_type to sh 18911 1727096324.61675: Set connection var ansible_module_compression to ZIP_DEFLATED 18911 1727096324.61677: Set connection var ansible_pipelining to False 18911 1727096324.61679: Set connection var ansible_connection to ssh 18911 1727096324.61715: variable 'ansible_shell_executable' from source: unknown 18911 1727096324.61872: variable 'ansible_connection' from source: unknown 18911 1727096324.61875: variable 'ansible_module_compression' from source: unknown 18911 1727096324.61878: variable 'ansible_shell_type' from source: unknown 18911 1727096324.61880: variable 'ansible_shell_executable' from source: unknown 18911 1727096324.61882: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096324.61884: variable 'ansible_pipelining' from source: unknown 18911 1727096324.61886: variable 'ansible_timeout' from source: unknown 18911 1727096324.61888: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096324.61915: Loading ActionModule 'assert' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18911 1727096324.61932: variable 'omit' from source: magic vars 18911 1727096324.61942: starting attempt loop 18911 1727096324.61950: running the handler 18911 1727096324.62111: variable 'interface_stat' from source: set_fact 18911 1727096324.62133: Evaluated conditional (not interface_stat.stat.exists): True 18911 1727096324.62144: handler run complete 18911 1727096324.62161: attempt loop complete, returning result 18911 1727096324.62170: _execute() done 18911 1727096324.62178: dumping result to json 18911 1727096324.62186: done dumping result, returning 18911 1727096324.62198: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is absent - 'lsr27' [0afff68d-5257-09a7-aae1-00000000053d] 18911 1727096324.62207: sending task result for task 0afff68d-5257-09a7-aae1-00000000053d ok: [managed_node1] => { "changed": false } MSG: All assertions passed 18911 1727096324.62380: no more pending results, returning what we have 18911 1727096324.62389: results queue empty 18911 1727096324.62390: checking for any_errors_fatal 18911 1727096324.62401: done checking for any_errors_fatal 18911 1727096324.62402: checking for max_fail_percentage 18911 1727096324.62404: done checking for max_fail_percentage 18911 1727096324.62405: checking to see if all hosts have failed and the running result is not ok 18911 1727096324.62406: done checking to see if all hosts have failed 18911 1727096324.62406: getting the remaining hosts for this loop 18911 1727096324.62408: done getting the remaining hosts for this loop 18911 1727096324.62412: getting the next task for host managed_node1 18911 1727096324.62421: done getting next task for host managed_node1 18911 1727096324.62424: ^ task is: TASK: meta 
(flush_handlers) 18911 1727096324.62426: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18911 1727096324.62429: getting variables 18911 1727096324.62431: in VariableManager get_vars() 18911 1727096324.62464: Calling all_inventory to load vars for managed_node1 18911 1727096324.62467: Calling groups_inventory to load vars for managed_node1 18911 1727096324.62472: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096324.62484: Calling all_plugins_play to load vars for managed_node1 18911 1727096324.62487: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096324.62490: Calling groups_plugins_play to load vars for managed_node1 18911 1727096324.63213: done sending task result for task 0afff68d-5257-09a7-aae1-00000000053d 18911 1727096324.63217: WORKER PROCESS EXITING 18911 1727096324.64303: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096324.65817: done with get_vars() 18911 1727096324.65846: done getting variables 18911 1727096324.65919: in VariableManager get_vars() 18911 1727096324.65930: Calling all_inventory to load vars for managed_node1 18911 1727096324.65932: Calling groups_inventory to load vars for managed_node1 18911 1727096324.65935: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096324.65939: Calling all_plugins_play to load vars for managed_node1 18911 1727096324.65941: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096324.65944: Calling groups_plugins_play to load vars for managed_node1 18911 1727096324.68158: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096324.69800: done 
with get_vars() 18911 1727096324.69833: done queuing things up, now waiting for results queue to drain 18911 1727096324.69835: results queue empty 18911 1727096324.69836: checking for any_errors_fatal 18911 1727096324.69840: done checking for any_errors_fatal 18911 1727096324.69841: checking for max_fail_percentage 18911 1727096324.69842: done checking for max_fail_percentage 18911 1727096324.69843: checking to see if all hosts have failed and the running result is not ok 18911 1727096324.69844: done checking to see if all hosts have failed 18911 1727096324.69850: getting the remaining hosts for this loop 18911 1727096324.69851: done getting the remaining hosts for this loop 18911 1727096324.69854: getting the next task for host managed_node1 18911 1727096324.69858: done getting next task for host managed_node1 18911 1727096324.69859: ^ task is: TASK: meta (flush_handlers) 18911 1727096324.69861: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096324.69863: getting variables 18911 1727096324.69864: in VariableManager get_vars() 18911 1727096324.69876: Calling all_inventory to load vars for managed_node1 18911 1727096324.69878: Calling groups_inventory to load vars for managed_node1 18911 1727096324.69880: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096324.69886: Calling all_plugins_play to load vars for managed_node1 18911 1727096324.69888: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096324.69890: Calling groups_plugins_play to load vars for managed_node1 18911 1727096324.71732: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096324.74011: done with get_vars() 18911 1727096324.74034: done getting variables 18911 1727096324.74087: in VariableManager get_vars() 18911 1727096324.74096: Calling all_inventory to load vars for managed_node1 18911 1727096324.74098: Calling groups_inventory to load vars for managed_node1 18911 1727096324.74101: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096324.74106: Calling all_plugins_play to load vars for managed_node1 18911 1727096324.74108: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096324.74110: Calling groups_plugins_play to load vars for managed_node1 18911 1727096324.76392: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096324.79741: done with get_vars() 18911 1727096324.79785: done queuing things up, now waiting for results queue to drain 18911 1727096324.79788: results queue empty 18911 1727096324.79789: checking for any_errors_fatal 18911 1727096324.79790: done checking for any_errors_fatal 18911 1727096324.79791: checking for max_fail_percentage 18911 1727096324.79792: done checking for max_fail_percentage 18911 1727096324.79793: checking to see if all hosts have failed and the running result is not 
ok 18911 1727096324.79794: done checking to see if all hosts have failed 18911 1727096324.79795: getting the remaining hosts for this loop 18911 1727096324.79796: done getting the remaining hosts for this loop 18911 1727096324.79798: getting the next task for host managed_node1 18911 1727096324.79802: done getting next task for host managed_node1 18911 1727096324.79802: ^ task is: None 18911 1727096324.79804: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18911 1727096324.79805: done queuing things up, now waiting for results queue to drain 18911 1727096324.79806: results queue empty 18911 1727096324.79807: checking for any_errors_fatal 18911 1727096324.79807: done checking for any_errors_fatal 18911 1727096324.79808: checking for max_fail_percentage 18911 1727096324.79809: done checking for max_fail_percentage 18911 1727096324.79810: checking to see if all hosts have failed and the running result is not ok 18911 1727096324.79810: done checking to see if all hosts have failed 18911 1727096324.79812: getting the next task for host managed_node1 18911 1727096324.79814: done getting next task for host managed_node1 18911 1727096324.79814: ^ task is: None 18911 1727096324.79816: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096324.79881: in VariableManager get_vars() 18911 1727096324.79898: done with get_vars() 18911 1727096324.79904: in VariableManager get_vars() 18911 1727096324.79913: done with get_vars() 18911 1727096324.79917: variable 'omit' from source: magic vars 18911 1727096324.79947: in VariableManager get_vars() 18911 1727096324.79956: done with get_vars() 18911 1727096324.79979: variable 'omit' from source: magic vars PLAY [Verify that cleanup restored state to default] *************************** 18911 1727096324.80214: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18911 1727096324.80237: getting the remaining hosts for this loop 18911 1727096324.80238: done getting the remaining hosts for this loop 18911 1727096324.80242: getting the next task for host managed_node1 18911 1727096324.80245: done getting next task for host managed_node1 18911 1727096324.80247: ^ task is: TASK: Gathering Facts 18911 1727096324.80248: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096324.80250: getting variables 18911 1727096324.80251: in VariableManager get_vars() 18911 1727096324.80260: Calling all_inventory to load vars for managed_node1 18911 1727096324.80262: Calling groups_inventory to load vars for managed_node1 18911 1727096324.80264: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096324.80272: Calling all_plugins_play to load vars for managed_node1 18911 1727096324.80274: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096324.80277: Calling groups_plugins_play to load vars for managed_node1 18911 1727096324.81549: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096324.83125: done with get_vars() 18911 1727096324.83149: done getting variables 18911 1727096324.83195: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:77 Monday 23 September 2024 08:58:44 -0400 (0:00:00.235) 0:00:43.946 ****** 18911 1727096324.83221: entering _queue_task() for managed_node1/gather_facts 18911 1727096324.83958: worker is 1 (out of 1 available) 18911 1727096324.83972: exiting _queue_task() for managed_node1/gather_facts 18911 1727096324.83983: done queuing things up, now waiting for results queue to drain 18911 1727096324.83985: waiting for pending results... 
18911 1727096324.84554: running TaskExecutor() for managed_node1/TASK: Gathering Facts 18911 1727096324.84559: in run() - task 0afff68d-5257-09a7-aae1-00000000056d 18911 1727096324.84575: variable 'ansible_search_path' from source: unknown 18911 1727096324.84612: calling self._execute() 18911 1727096324.84774: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096324.84790: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096324.84813: variable 'omit' from source: magic vars 18911 1727096324.85253: variable 'ansible_distribution_major_version' from source: facts 18911 1727096324.85278: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096324.85289: variable 'omit' from source: magic vars 18911 1727096324.85318: variable 'omit' from source: magic vars 18911 1727096324.85372: variable 'omit' from source: magic vars 18911 1727096324.85420: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18911 1727096324.85474: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18911 1727096324.85559: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18911 1727096324.85563: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096324.85566: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096324.85583: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18911 1727096324.85593: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096324.85601: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096324.85720: Set connection var ansible_shell_executable to /bin/sh 18911 1727096324.85733: Set 
connection var ansible_timeout to 10 18911 1727096324.85740: Set connection var ansible_shell_type to sh 18911 1727096324.85753: Set connection var ansible_module_compression to ZIP_DEFLATED 18911 1727096324.85763: Set connection var ansible_pipelining to False 18911 1727096324.85778: Set connection var ansible_connection to ssh 18911 1727096324.85872: variable 'ansible_shell_executable' from source: unknown 18911 1727096324.85877: variable 'ansible_connection' from source: unknown 18911 1727096324.85883: variable 'ansible_module_compression' from source: unknown 18911 1727096324.85886: variable 'ansible_shell_type' from source: unknown 18911 1727096324.85888: variable 'ansible_shell_executable' from source: unknown 18911 1727096324.85890: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096324.85892: variable 'ansible_pipelining' from source: unknown 18911 1727096324.85894: variable 'ansible_timeout' from source: unknown 18911 1727096324.85896: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096324.86055: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18911 1727096324.86075: variable 'omit' from source: magic vars 18911 1727096324.86086: starting attempt loop 18911 1727096324.86094: running the handler 18911 1727096324.86125: variable 'ansible_facts' from source: unknown 18911 1727096324.86150: _low_level_execute_command(): starting 18911 1727096324.86162: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18911 1727096324.86998: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096324.87025: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096324.87043: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096324.87212: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096324.87402: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096324.89127: stdout chunk (state=3): >>>/root <<< 18911 1727096324.89289: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096324.89292: stdout chunk (state=3): >>><<< 18911 1727096324.89295: stderr chunk (state=3): >>><<< 18911 1727096324.89317: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096324.89337: _low_level_execute_command(): starting 18911 1727096324.89346: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096324.893248-20951-227477918075561 `" && echo ansible-tmp-1727096324.893248-20951-227477918075561="` echo /root/.ansible/tmp/ansible-tmp-1727096324.893248-20951-227477918075561 `" ) && sleep 0' 18911 1727096324.89963: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096324.89982: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096324.89999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096324.90025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096324.90043: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 18911 1727096324.90076: stderr chunk (state=3): >>>debug2: match not found <<< 18911 1727096324.90128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration <<< 18911 1727096324.90133: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096324.90196: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096324.90230: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096324.90257: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096324.90455: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096324.92370: stdout chunk (state=3): >>>ansible-tmp-1727096324.893248-20951-227477918075561=/root/.ansible/tmp/ansible-tmp-1727096324.893248-20951-227477918075561 <<< 18911 1727096324.92537: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096324.92540: stdout chunk (state=3): >>><<< 18911 1727096324.92543: stderr chunk (state=3): >>><<< 18911 1727096324.92773: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096324.893248-20951-227477918075561=/root/.ansible/tmp/ansible-tmp-1727096324.893248-20951-227477918075561 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096324.92776: variable 'ansible_module_compression' from source: unknown 18911 1727096324.92779: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18911d7od04qi/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 18911 1727096324.92781: variable 'ansible_facts' from source: unknown 18911 1727096324.92943: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096324.893248-20951-227477918075561/AnsiballZ_setup.py 18911 1727096324.93133: Sending initial data 18911 1727096324.93143: Sent initial data (153 bytes) 18911 1727096324.93799: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096324.93813: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096324.93883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096324.93933: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096324.93952: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096324.93980: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096324.94085: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096324.95706: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 18911 1727096324.95720: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 18911 1727096324.95746: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 18911 1727096324.95777: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18911 1727096324.95894: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18911 1727096324.95971: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18911d7od04qi/tmpmjr2m_9p /root/.ansible/tmp/ansible-tmp-1727096324.893248-20951-227477918075561/AnsiballZ_setup.py <<< 18911 1727096324.95974: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096324.893248-20951-227477918075561/AnsiballZ_setup.py" <<< 18911 1727096324.96051: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18911d7od04qi/tmpmjr2m_9p" to remote "/root/.ansible/tmp/ansible-tmp-1727096324.893248-20951-227477918075561/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096324.893248-20951-227477918075561/AnsiballZ_setup.py" <<< 18911 1727096324.98755: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096324.98845: stderr chunk (state=3): >>><<< 18911 1727096324.98859: stdout chunk (state=3): >>><<< 18911 1727096324.98893: done transferring module to remote 18911 1727096324.98997: _low_level_execute_command(): starting 18911 1727096324.99002: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096324.893248-20951-227477918075561/ /root/.ansible/tmp/ansible-tmp-1727096324.893248-20951-227477918075561/AnsiballZ_setup.py && sleep 0' 18911 1727096324.99536: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096324.99550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096324.99570: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096324.99623: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096324.99626: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096324.99698: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096325.01562: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096325.01594: stderr chunk (state=3): >>><<< 18911 1727096325.01596: stdout chunk (state=3): >>><<< 18911 1727096325.01607: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096325.01670: _low_level_execute_command(): starting 18911 1727096325.01674: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096324.893248-20951-227477918075561/AnsiballZ_setup.py && sleep 0' 18911 1727096325.02050: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096325.02053: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 18911 1727096325.02055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 18911 1727096325.02057: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096325.02059: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096325.02107: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096325.02113: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096325.02188: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096325.65778: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-125", "ansible_nodename": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2082d1236a0674562ec5e13633e7ec", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCSodd+GDkKMH+cjKateYxMnFqGIiE1cnsha6Cte8GTIWdeerR6yaWx/72lnuD/E1u9q+QiEGNjwKJ+EVlEbMpe1E41fS12cQMWtucXlJvMJH1Ykit9m7FF02YjLW9EkA9SSgc18cxhBxGHI+sOyXoiqbbp/q8CeDDI6WWQ+a3ZlRONcuaxwv9JBCJY6ZIqEDUsG55fAkwTft0LrM9VfN5D7iCcrlRTAVcjACJKgdvzHt52sVh5wuhCgzisIGrxldgqYAo0PnKgLwHzMWTYpwBJQIwMf6hC7O+PTnljDRXhsY8vTDdVI9W4Nr7mLWJ7W5JZ0lrKajT+FZ1b8usigp6firlnTtqzVmKJ4HJTlBOQ00HmCvVRSFb8Y15UB8Gz/xjJSXiQklJoKt+/wRRazuZALEowaIIdTzQUGv5Yx/BYhcAv8KVvhqbfkvyuwAgI+jzck3nhlUJ3B1WVYs5RdDY1WWEbqEtKyERi7P+cyWg6OhEMmJIpLs1QBW+CU/89w+s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE9Vvl7mU+dsvtGHlOyMt0Nx2BOUkSxor6Ul/pQgHJkP0t5ie5s9TW9uniTzz5NfayEXaMlnOkK/UBl0o7msG9k=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOsOWQBYI+BXY81/x/jv3URNQxi5p8vLDeQfEZVgnjS4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_apparmor": {"status": "disabled"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "58", "second": "45", "epoch": "1727096325", 
"epoch_int": "1727096325", "date": "2024-09-23", "time": "08:58:45", "iso8601_micro": "2024-09-23T12:58:45.298938Z", "iso8601": "2024-09-23T12:58:45Z", "iso8601_basic": "20240923T085845298938", "iso8601_basic_short": "20240923T085845", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/<<< 18911 1727096325.65791: stdout chunk (state=3): >>>bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 40710 10.31.11.125 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 40710 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fibre_channel_wwn": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_iscsi_iqn": "", "ansible_lsb": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2947, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 584, "free": 2947}, "nocache": {"free": 3285, "used": 246}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": 
"Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_uuid": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 478, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261795459072, "block_size": 4096, "block_total": 65519099, "block_available": 63914907, "block_used": 1604192, "inode_total": 
131070960, "inode_available": 131029100, "inode_used": 41860, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_local": {}, "ansible_fips": false, "ansible_is_chroot": false, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv<<< 18911 1727096325.65795: stdout chunk (state=3): >>>4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", 
"fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10ff:acff:fe3f:90f5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": 
"off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.125"], "ansible_all_ipv6_addresses": ["fe80::10ff:acff:fe3f:90f5"], 
"ansible_locally_reachable_ips": {"ipv4": ["10.31.11.125", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10ff:acff:fe3f:90f5"]}, "ansible_loadavg": {"1m": 0.6083984375, "5m": 0.40087890625, "15m": 0.19775390625}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 18911 1727096325.67855: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. <<< 18911 1727096325.67891: stderr chunk (state=3): >>><<< 18911 1727096325.67894: stdout chunk (state=3): >>><<< 18911 1727096325.67927: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-125", "ansible_nodename": 
"ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2082d1236a0674562ec5e13633e7ec", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCSodd+GDkKMH+cjKateYxMnFqGIiE1cnsha6Cte8GTIWdeerR6yaWx/72lnuD/E1u9q+QiEGNjwKJ+EVlEbMpe1E41fS12cQMWtucXlJvMJH1Ykit9m7FF02YjLW9EkA9SSgc18cxhBxGHI+sOyXoiqbbp/q8CeDDI6WWQ+a3ZlRONcuaxwv9JBCJY6ZIqEDUsG55fAkwTft0LrM9VfN5D7iCcrlRTAVcjACJKgdvzHt52sVh5wuhCgzisIGrxldgqYAo0PnKgLwHzMWTYpwBJQIwMf6hC7O+PTnljDRXhsY8vTDdVI9W4Nr7mLWJ7W5JZ0lrKajT+FZ1b8usigp6firlnTtqzVmKJ4HJTlBOQ00HmCvVRSFb8Y15UB8Gz/xjJSXiQklJoKt+/wRRazuZALEowaIIdTzQUGv5Yx/BYhcAv8KVvhqbfkvyuwAgI+jzck3nhlUJ3B1WVYs5RdDY1WWEbqEtKyERi7P+cyWg6OhEMmJIpLs1QBW+CU/89w+s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE9Vvl7mU+dsvtGHlOyMt0Nx2BOUkSxor6Ul/pQgHJkP0t5ie5s9TW9uniTzz5NfayEXaMlnOkK/UBl0o7msG9k=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOsOWQBYI+BXY81/x/jv3URNQxi5p8vLDeQfEZVgnjS4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": 
"(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_apparmor": {"status": "disabled"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "58", "second": "45", "epoch": "1727096325", "epoch_int": "1727096325", "date": "2024-09-23", "time": "08:58:45", "iso8601_micro": "2024-09-23T12:58:45.298938Z", "iso8601": "2024-09-23T12:58:45Z", "iso8601_basic": "20240923T085845298938", "iso8601_basic_short": "20240923T085845", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 40710 10.31.11.125 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 40710 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fibre_channel_wwn": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_iscsi_iqn": "", "ansible_lsb": {}, "ansible_processor": ["0", "GenuineIntel", 
"Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2947, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 584, "free": 2947}, "nocache": {"free": 3285, "used": 246}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_uuid": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", 
"size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 478, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261795459072, "block_size": 4096, "block_total": 65519099, "block_available": 63914907, "block_used": 1604192, "inode_total": 131070960, "inode_available": 131029100, "inode_used": 41860, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_local": {}, "ansible_fips": false, "ansible_is_chroot": false, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", 
"netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10ff:acff:fe3f:90f5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": 
"off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", 
"hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.125"], "ansible_all_ipv6_addresses": ["fe80::10ff:acff:fe3f:90f5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.125", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10ff:acff:fe3f:90f5"]}, "ansible_loadavg": {"1m": 0.6083984375, "5m": 0.40087890625, "15m": 0.19775390625}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 18911 1727096325.68163: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096324.893248-20951-227477918075561/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18911 1727096325.68186: _low_level_execute_command(): starting 18911 1727096325.68189: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096324.893248-20951-227477918075561/ > /dev/null 2>&1 && sleep 0' 18911 1727096325.68655: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096325.68659: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096325.68661: stderr chunk (state=3): >>>debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096325.68666: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found <<< 18911 1727096325.68671: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096325.68720: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096325.68723: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096325.68728: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096325.68797: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096325.70702: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096325.70729: stderr chunk (state=3): >>><<< 18911 1727096325.70732: stdout chunk (state=3): >>><<< 18911 1727096325.70746: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096325.70754: handler run complete 18911 1727096325.70839: variable 'ansible_facts' from source: unknown 18911 1727096325.70916: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096325.71101: variable 'ansible_facts' from source: unknown 18911 1727096325.71151: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096325.71228: attempt loop complete, returning result 18911 1727096325.71231: _execute() done 18911 1727096325.71234: dumping result to json 18911 1727096325.71255: done dumping result, returning 18911 1727096325.71262: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0afff68d-5257-09a7-aae1-00000000056d] 18911 1727096325.71269: sending task result for task 0afff68d-5257-09a7-aae1-00000000056d 18911 1727096325.71554: done sending task result for task 0afff68d-5257-09a7-aae1-00000000056d 18911 1727096325.71557: WORKER PROCESS EXITING ok: [managed_node1] 18911 1727096325.71790: no more pending results, returning what we have 18911 1727096325.71793: results queue empty 18911 1727096325.71793: checking for any_errors_fatal 18911 1727096325.71794: done checking for any_errors_fatal 18911 1727096325.71795: checking for max_fail_percentage 18911 1727096325.71796: done checking for max_fail_percentage 18911 1727096325.71797: checking to see if all hosts have failed and the running result is not ok 18911 1727096325.71797: done 
checking to see if all hosts have failed 18911 1727096325.71798: getting the remaining hosts for this loop 18911 1727096325.71799: done getting the remaining hosts for this loop 18911 1727096325.71801: getting the next task for host managed_node1 18911 1727096325.71805: done getting next task for host managed_node1 18911 1727096325.71806: ^ task is: TASK: meta (flush_handlers) 18911 1727096325.71807: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18911 1727096325.71810: getting variables 18911 1727096325.71811: in VariableManager get_vars() 18911 1727096325.71828: Calling all_inventory to load vars for managed_node1 18911 1727096325.71829: Calling groups_inventory to load vars for managed_node1 18911 1727096325.71831: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096325.71839: Calling all_plugins_play to load vars for managed_node1 18911 1727096325.71842: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096325.71843: Calling groups_plugins_play to load vars for managed_node1 18911 1727096325.72598: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096325.77025: done with get_vars() 18911 1727096325.77046: done getting variables 18911 1727096325.77092: in VariableManager get_vars() 18911 1727096325.77099: Calling all_inventory to load vars for managed_node1 18911 1727096325.77101: Calling groups_inventory to load vars for managed_node1 18911 1727096325.77102: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096325.77107: Calling all_plugins_play to load vars for managed_node1 18911 1727096325.77109: Calling groups_plugins_inventory to load vars for managed_node1 18911 
1727096325.77111: Calling groups_plugins_play to load vars for managed_node1 18911 1727096325.77741: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096325.78601: done with get_vars() 18911 1727096325.78622: done queuing things up, now waiting for results queue to drain 18911 1727096325.78624: results queue empty 18911 1727096325.78625: checking for any_errors_fatal 18911 1727096325.78629: done checking for any_errors_fatal 18911 1727096325.78635: checking for max_fail_percentage 18911 1727096325.78636: done checking for max_fail_percentage 18911 1727096325.78636: checking to see if all hosts have failed and the running result is not ok 18911 1727096325.78637: done checking to see if all hosts have failed 18911 1727096325.78637: getting the remaining hosts for this loop 18911 1727096325.78638: done getting the remaining hosts for this loop 18911 1727096325.78641: getting the next task for host managed_node1 18911 1727096325.78644: done getting next task for host managed_node1 18911 1727096325.78645: ^ task is: TASK: Verify network state restored to default 18911 1727096325.78646: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096325.78648: getting variables 18911 1727096325.78649: in VariableManager get_vars() 18911 1727096325.78655: Calling all_inventory to load vars for managed_node1 18911 1727096325.78657: Calling groups_inventory to load vars for managed_node1 18911 1727096325.78658: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096325.78663: Calling all_plugins_play to load vars for managed_node1 18911 1727096325.78665: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096325.78668: Calling groups_plugins_play to load vars for managed_node1 18911 1727096325.79361: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096325.80256: done with get_vars() 18911 1727096325.80281: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:80 Monday 23 September 2024 08:58:45 -0400 (0:00:00.971) 0:00:44.917 ****** 18911 1727096325.80332: entering _queue_task() for managed_node1/include_tasks 18911 1727096325.80614: worker is 1 (out of 1 available) 18911 1727096325.80628: exiting _queue_task() for managed_node1/include_tasks 18911 1727096325.80639: done queuing things up, now waiting for results queue to drain 18911 1727096325.80641: waiting for pending results... 
18911 1727096325.80817: running TaskExecutor() for managed_node1/TASK: Verify network state restored to default 18911 1727096325.80888: in run() - task 0afff68d-5257-09a7-aae1-000000000078 18911 1727096325.80900: variable 'ansible_search_path' from source: unknown 18911 1727096325.80931: calling self._execute() 18911 1727096325.81007: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096325.81014: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096325.81022: variable 'omit' from source: magic vars 18911 1727096325.81324: variable 'ansible_distribution_major_version' from source: facts 18911 1727096325.81335: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096325.81341: _execute() done 18911 1727096325.81344: dumping result to json 18911 1727096325.81347: done dumping result, returning 18911 1727096325.81354: done running TaskExecutor() for managed_node1/TASK: Verify network state restored to default [0afff68d-5257-09a7-aae1-000000000078] 18911 1727096325.81359: sending task result for task 0afff68d-5257-09a7-aae1-000000000078 18911 1727096325.81456: done sending task result for task 0afff68d-5257-09a7-aae1-000000000078 18911 1727096325.81458: WORKER PROCESS EXITING 18911 1727096325.81490: no more pending results, returning what we have 18911 1727096325.81494: in VariableManager get_vars() 18911 1727096325.81526: Calling all_inventory to load vars for managed_node1 18911 1727096325.81528: Calling groups_inventory to load vars for managed_node1 18911 1727096325.81531: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096325.81544: Calling all_plugins_play to load vars for managed_node1 18911 1727096325.81547: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096325.81549: Calling groups_plugins_play to load vars for managed_node1 18911 1727096325.82889: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096325.83788: done with get_vars() 18911 1727096325.83807: variable 'ansible_search_path' from source: unknown 18911 1727096325.83819: we have included files to process 18911 1727096325.83820: generating all_blocks data 18911 1727096325.83821: done generating all_blocks data 18911 1727096325.83821: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 18911 1727096325.83822: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 18911 1727096325.83824: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 18911 1727096325.84107: done processing included file 18911 1727096325.84108: iterating over new_blocks loaded from include file 18911 1727096325.84109: in VariableManager get_vars() 18911 1727096325.84119: done with get_vars() 18911 1727096325.84120: filtering new block on tags 18911 1727096325.84130: done filtering new block on tags 18911 1727096325.84132: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node1 18911 1727096325.84136: extending task lists for all hosts with included blocks 18911 1727096325.84156: done extending task lists 18911 1727096325.84157: done processing included files 18911 1727096325.84157: results queue empty 18911 1727096325.84157: checking for any_errors_fatal 18911 1727096325.84159: done checking for any_errors_fatal 18911 1727096325.84159: checking for max_fail_percentage 18911 1727096325.84160: done checking for max_fail_percentage 18911 1727096325.84160: checking to see if all hosts have failed and the running 
result is not ok 18911 1727096325.84161: done checking to see if all hosts have failed 18911 1727096325.84161: getting the remaining hosts for this loop 18911 1727096325.84162: done getting the remaining hosts for this loop 18911 1727096325.84166: getting the next task for host managed_node1 18911 1727096325.84172: done getting next task for host managed_node1 18911 1727096325.84174: ^ task is: TASK: Check routes and DNS 18911 1727096325.84175: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096325.84177: getting variables 18911 1727096325.84177: in VariableManager get_vars() 18911 1727096325.84184: Calling all_inventory to load vars for managed_node1 18911 1727096325.84185: Calling groups_inventory to load vars for managed_node1 18911 1727096325.84187: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096325.84192: Calling all_plugins_play to load vars for managed_node1 18911 1727096325.84194: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096325.84195: Calling groups_plugins_play to load vars for managed_node1 18911 1727096325.85117: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096325.86661: done with get_vars() 18911 1727096325.86699: done getting variables 18911 1727096325.86746: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Monday 23 September 2024 08:58:45 -0400 (0:00:00.064) 0:00:44.982 ****** 18911 1727096325.86782: entering _queue_task() for managed_node1/shell 18911 1727096325.87156: worker is 1 (out of 1 available) 18911 1727096325.87176: exiting _queue_task() for managed_node1/shell 18911 1727096325.87187: done queuing things up, now waiting for results queue to drain 18911 1727096325.87189: waiting for pending results... 
18911 1727096325.87584: running TaskExecutor() for managed_node1/TASK: Check routes and DNS 18911 1727096325.87589: in run() - task 0afff68d-5257-09a7-aae1-00000000057e 18911 1727096325.87592: variable 'ansible_search_path' from source: unknown 18911 1727096325.87595: variable 'ansible_search_path' from source: unknown 18911 1727096325.87613: calling self._execute() 18911 1727096325.87718: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096325.87730: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096325.87744: variable 'omit' from source: magic vars 18911 1727096325.88128: variable 'ansible_distribution_major_version' from source: facts 18911 1727096325.88251: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096325.88255: variable 'omit' from source: magic vars 18911 1727096325.88257: variable 'omit' from source: magic vars 18911 1727096325.88260: variable 'omit' from source: magic vars 18911 1727096325.88292: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18911 1727096325.88333: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18911 1727096325.88368: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18911 1727096325.88393: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096325.88410: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096325.88445: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18911 1727096325.88455: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096325.88470: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096325.88578: 
Set connection var ansible_shell_executable to /bin/sh 18911 1727096325.88675: Set connection var ansible_timeout to 10 18911 1727096325.88677: Set connection var ansible_shell_type to sh 18911 1727096325.88679: Set connection var ansible_module_compression to ZIP_DEFLATED 18911 1727096325.88681: Set connection var ansible_pipelining to False 18911 1727096325.88683: Set connection var ansible_connection to ssh 18911 1727096325.88684: variable 'ansible_shell_executable' from source: unknown 18911 1727096325.88686: variable 'ansible_connection' from source: unknown 18911 1727096325.88690: variable 'ansible_module_compression' from source: unknown 18911 1727096325.88692: variable 'ansible_shell_type' from source: unknown 18911 1727096325.88694: variable 'ansible_shell_executable' from source: unknown 18911 1727096325.88695: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096325.88697: variable 'ansible_pipelining' from source: unknown 18911 1727096325.88698: variable 'ansible_timeout' from source: unknown 18911 1727096325.88700: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096325.88806: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18911 1727096325.88831: variable 'omit' from source: magic vars 18911 1727096325.88873: starting attempt loop 18911 1727096325.88877: running the handler 18911 1727096325.88879: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18911 1727096325.88891: 
_low_level_execute_command(): starting 18911 1727096325.88903: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18911 1727096325.89679: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096325.89804: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096325.89818: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096325.89877: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096325.89961: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096325.91710: stdout chunk (state=3): >>>/root <<< 18911 1727096325.91809: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096325.91846: stderr chunk (state=3): >>><<< 18911 1727096325.91849: stdout chunk (state=3): >>><<< 18911 1727096325.91865: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096325.91886: _low_level_execute_command(): starting 18911 1727096325.91947: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096325.9187555-20994-118335111589071 `" && echo ansible-tmp-1727096325.9187555-20994-118335111589071="` echo /root/.ansible/tmp/ansible-tmp-1727096325.9187555-20994-118335111589071 `" ) && sleep 0' 18911 1727096325.92329: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096325.92332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096325.92342: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096325.92345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096325.92394: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096325.92400: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096325.92465: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096325.94445: stdout chunk (state=3): >>>ansible-tmp-1727096325.9187555-20994-118335111589071=/root/.ansible/tmp/ansible-tmp-1727096325.9187555-20994-118335111589071 <<< 18911 1727096325.94548: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096325.94583: stderr chunk (state=3): >>><<< 18911 1727096325.94585: stdout chunk (state=3): >>><<< 18911 1727096325.94600: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096325.9187555-20994-118335111589071=/root/.ansible/tmp/ansible-tmp-1727096325.9187555-20994-118335111589071 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096325.94678: variable 'ansible_module_compression' from source: unknown 18911 1727096325.94682: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18911d7od04qi/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 18911 1727096325.94711: variable 'ansible_facts' from source: unknown 18911 1727096325.94762: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096325.9187555-20994-118335111589071/AnsiballZ_command.py 18911 1727096325.94870: Sending initial data 18911 1727096325.94874: Sent initial data (156 bytes) 18911 1727096325.95355: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096325.95360: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096325.95369: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096325.95431: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096325.97091: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18911 1727096325.97152: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18911 1727096325.97219: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18911d7od04qi/tmp685ou5ir /root/.ansible/tmp/ansible-tmp-1727096325.9187555-20994-118335111589071/AnsiballZ_command.py <<< 18911 1727096325.97223: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096325.9187555-20994-118335111589071/AnsiballZ_command.py" <<< 18911 1727096325.97289: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18911d7od04qi/tmp685ou5ir" to remote "/root/.ansible/tmp/ansible-tmp-1727096325.9187555-20994-118335111589071/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096325.9187555-20994-118335111589071/AnsiballZ_command.py" <<< 18911 1727096325.97906: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096325.97950: stderr chunk (state=3): >>><<< 18911 1727096325.97953: stdout chunk (state=3): >>><<< 18911 1727096325.97986: done transferring module to remote 18911 1727096325.97995: _low_level_execute_command(): starting 18911 1727096325.98000: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096325.9187555-20994-118335111589071/ /root/.ansible/tmp/ansible-tmp-1727096325.9187555-20994-118335111589071/AnsiballZ_command.py && sleep 0' 18911 1727096325.98426: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096325.98430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096325.98442: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096325.98491: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096325.98510: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096325.98618: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096326.00500: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096326.00527: stderr chunk (state=3): >>><<< 18911 1727096326.00531: stdout chunk (state=3): >>><<< 18911 1727096326.00550: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096326.00648: _low_level_execute_command(): starting 18911 1727096326.00652: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096325.9187555-20994-118335111589071/AnsiballZ_command.py && sleep 0' 18911 1727096326.01289: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096326.01293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096326.01305: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096326.01308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found <<< 18911 1727096326.01323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096326.01374: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096326.17666: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:ff:ac:3f:90:f5 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.11.125/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 3135sec preferred_lft 3135sec\n inet6 fe80::10ff:acff:fe3f:90f5/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.11.125 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.11.125 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-23 08:58:46.166171", "end": "2024-09-23 08:58:46.174960", "delta": "0:00:00.008789", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, 
"removes": null, "stdin": null}}} <<< 18911 1727096326.19265: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. <<< 18911 1727096326.19295: stderr chunk (state=3): >>><<< 18911 1727096326.19298: stdout chunk (state=3): >>><<< 18911 1727096326.19317: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:ff:ac:3f:90:f5 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.11.125/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 3135sec preferred_lft 3135sec\n inet6 fe80::10ff:acff:fe3f:90f5/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.11.125 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.11.125 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-23 08:58:46.166171", "end": "2024-09-23 08:58:46.174960", "delta": "0:00:00.008789", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n 
echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
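For readability, the `_raw_params` payload recorded above (shown in the log with `\n` escapes) is the following diagnostic script. It assumes the iproute2 `ip` tool is available on the managed node, which the successful `rc=0` result confirms here:

```shell
# Diagnostic payload run by the "Check routes and DNS" task, unescaped:
set -euo pipefail
echo IP
ip a
echo IP ROUTE
ip route
echo IP -6 ROUTE
ip -6 route
echo RESOLV
if [ -f /etc/resolv.conf ]; then
 cat /etc/resolv.conf
else
 echo NO /etc/resolv.conf
 ls -alrtF /etc/resolv.* || :
fi
```

Each `echo` line emits a section marker, which is why the task's `stdout` interleaves headings with command output.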
18911 1727096326.19352: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096325.9187555-20994-118335111589071/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18911 1727096326.19360: _low_level_execute_command(): starting 18911 1727096326.19369: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096325.9187555-20994-118335111589071/ > /dev/null 2>&1 && sleep 0' 18911 1727096326.19834: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096326.19837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096326.19841: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096326.19843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096326.19899: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096326.19902: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096326.19909: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096326.19984: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096326.21904: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096326.21907: stdout chunk (state=3): >>><<< 18911 1727096326.21909: stderr chunk (state=3): >>><<< 18911 1727096326.22075: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096326.22079: handler run complete 18911 1727096326.22082: Evaluated conditional (False): False 18911 1727096326.22084: attempt loop complete, returning result 18911 1727096326.22087: _execute() done 18911 1727096326.22089: dumping result to json 18911 1727096326.22091: done dumping result, returning 18911 1727096326.22093: done running TaskExecutor() for managed_node1/TASK: Check routes and DNS [0afff68d-5257-09a7-aae1-00000000057e] 18911 1727096326.22095: sending task result for task 0afff68d-5257-09a7-aae1-00000000057e 18911 1727096326.22173: done sending task result for task 0afff68d-5257-09a7-aae1-00000000057e 18911 1727096326.22177: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008789", "end": "2024-09-23 08:58:46.174960", "rc": 0, "start": "2024-09-23 08:58:46.166171" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 12:ff:ac:3f:90:f5 brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.11.125/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0 valid_lft 3135sec preferred_lft 3135sec inet6 fe80::10ff:acff:fe3f:90f5/64 scope link noprefixroute valid_lft 
forever preferred_lft forever IP ROUTE default via 10.31.8.1 dev eth0 proto dhcp src 10.31.11.125 metric 100 10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.11.125 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 18911 1727096326.22252: no more pending results, returning what we have 18911 1727096326.22256: results queue empty 18911 1727096326.22257: checking for any_errors_fatal 18911 1727096326.22259: done checking for any_errors_fatal 18911 1727096326.22260: checking for max_fail_percentage 18911 1727096326.22262: done checking for max_fail_percentage 18911 1727096326.22263: checking to see if all hosts have failed and the running result is not ok 18911 1727096326.22263: done checking to see if all hosts have failed 18911 1727096326.22264: getting the remaining hosts for this loop 18911 1727096326.22265: done getting the remaining hosts for this loop 18911 1727096326.22271: getting the next task for host managed_node1 18911 1727096326.22278: done getting next task for host managed_node1 18911 1727096326.22281: ^ task is: TASK: Verify DNS and network connectivity 18911 1727096326.22284: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096326.22377: getting variables 18911 1727096326.22380: in VariableManager get_vars() 18911 1727096326.22412: Calling all_inventory to load vars for managed_node1 18911 1727096326.22419: Calling groups_inventory to load vars for managed_node1 18911 1727096326.22423: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096326.22435: Calling all_plugins_play to load vars for managed_node1 18911 1727096326.22438: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096326.22440: Calling groups_plugins_play to load vars for managed_node1 18911 1727096326.23539: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096326.24538: done with get_vars() 18911 1727096326.24554: done getting variables 18911 1727096326.24604: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Monday 23 September 2024 08:58:46 -0400 (0:00:00.378) 0:00:45.360 ****** 18911 1727096326.24628: entering _queue_task() for managed_node1/shell 18911 1727096326.25296: worker is 1 (out of 1 available) 18911 1727096326.25304: exiting _queue_task() for managed_node1/shell 18911 1727096326.25314: done queuing things up, now waiting for results queue to drain 18911 1727096326.25315: waiting for pending results... 
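The "Check routes and DNS" stdout captured earlier interleaves four marker-delimited sections (`IP`, `IP ROUTE`, `IP -6 ROUTE`, `RESOLV`). A minimal sketch of splitting such a result back into sections, using a hypothetical `split_sections` helper (not part of Ansible):

```python
def split_sections(stdout, markers=("IP", "IP ROUTE", "IP -6 ROUTE", "RESOLV")):
    """Group marker-delimited diagnostic output into {marker: [lines]}."""
    sections, current = {}, None
    for line in stdout.splitlines():
        if line in markers:
            current = line          # a bare marker line starts a new section
            sections[current] = []
        elif current is not None:
            sections[current].append(line)  # everything else belongs to it
    return sections

# Abbreviated stdout in the shape the task produces:
sample = (
    "IP\n2: eth0: mtu 9001 ...\n"
    "IP ROUTE\ndefault via 10.31.8.1 dev eth0 ...\n"
    "IP -6 ROUTE\nfe80::/64 dev eth0 ...\n"
    "RESOLV\nnameserver 10.29.169.13"
)
print(split_sections(sample)["RESOLV"])  # ['nameserver 10.29.169.13']
```

This works only because the payload's `echo` markers are exact full-line matches; any `ip` output line that happened to equal a marker would start a spurious section.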
18911 1727096326.25548: running TaskExecutor() for managed_node1/TASK: Verify DNS and network connectivity 18911 1727096326.25553: in run() - task 0afff68d-5257-09a7-aae1-00000000057f 18911 1727096326.25555: variable 'ansible_search_path' from source: unknown 18911 1727096326.25557: variable 'ansible_search_path' from source: unknown 18911 1727096326.25563: calling self._execute() 18911 1727096326.25658: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096326.25675: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096326.25689: variable 'omit' from source: magic vars 18911 1727096326.26081: variable 'ansible_distribution_major_version' from source: facts 18911 1727096326.26099: Evaluated conditional (ansible_distribution_major_version != '6'): True 18911 1727096326.26243: variable 'ansible_facts' from source: unknown 18911 1727096326.27021: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 18911 1727096326.27033: variable 'omit' from source: magic vars 18911 1727096326.27087: variable 'omit' from source: magic vars 18911 1727096326.27122: variable 'omit' from source: magic vars 18911 1727096326.27173: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18911 1727096326.27212: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18911 1727096326.27236: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18911 1727096326.27273: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096326.27276: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18911 1727096326.27326: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18911 1727096326.27381: variable 
'ansible_host' from source: host vars for 'managed_node1' 18911 1727096326.27384: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096326.27451: Set connection var ansible_shell_executable to /bin/sh 18911 1727096326.27462: Set connection var ansible_timeout to 10 18911 1727096326.27475: Set connection var ansible_shell_type to sh 18911 1727096326.27490: Set connection var ansible_module_compression to ZIP_DEFLATED 18911 1727096326.27500: Set connection var ansible_pipelining to False 18911 1727096326.27509: Set connection var ansible_connection to ssh 18911 1727096326.27534: variable 'ansible_shell_executable' from source: unknown 18911 1727096326.27541: variable 'ansible_connection' from source: unknown 18911 1727096326.27597: variable 'ansible_module_compression' from source: unknown 18911 1727096326.27600: variable 'ansible_shell_type' from source: unknown 18911 1727096326.27602: variable 'ansible_shell_executable' from source: unknown 18911 1727096326.27604: variable 'ansible_host' from source: host vars for 'managed_node1' 18911 1727096326.27607: variable 'ansible_pipelining' from source: unknown 18911 1727096326.27608: variable 'ansible_timeout' from source: unknown 18911 1727096326.27611: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18911 1727096326.27733: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18911 1727096326.27748: variable 'omit' from source: magic vars 18911 1727096326.27757: starting attempt loop 18911 1727096326.27766: running the handler 18911 1727096326.27783: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18911 1727096326.27804: _low_level_execute_command(): starting 18911 1727096326.27923: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18911 1727096326.28539: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096326.28593: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096326.28607: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18911 1727096326.28621: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.125 is address <<< 18911 1727096326.28701: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096326.28729: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096326.28749: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096326.28781: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096326.28892: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096326.30639: stdout chunk 
(state=3): >>>/root <<< 18911 1727096326.30777: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096326.30789: stdout chunk (state=3): >>><<< 18911 1727096326.30803: stderr chunk (state=3): >>><<< 18911 1727096326.30831: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096326.30939: _low_level_execute_command(): starting 18911 1727096326.30944: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096326.308392-21017-10241747358960 `" && echo ansible-tmp-1727096326.308392-21017-10241747358960="` echo /root/.ansible/tmp/ansible-tmp-1727096326.308392-21017-10241747358960 `" ) && sleep 0' 18911 1727096326.31552: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096326.31608: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096326.33617: stdout chunk (state=3): >>>ansible-tmp-1727096326.308392-21017-10241747358960=/root/.ansible/tmp/ansible-tmp-1727096326.308392-21017-10241747358960 <<< 18911 1727096326.33765: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096326.33778: stdout chunk (state=3): >>><<< 18911 1727096326.33793: stderr chunk (state=3): >>><<< 18911 1727096326.33820: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096326.308392-21017-10241747358960=/root/.ansible/tmp/ansible-tmp-1727096326.308392-21017-10241747358960 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096326.33874: variable 'ansible_module_compression' from source: unknown 18911 1727096326.33927: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18911d7od04qi/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 18911 1727096326.34143: variable 'ansible_facts' from source: unknown 18911 1727096326.34146: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096326.308392-21017-10241747358960/AnsiballZ_command.py 18911 1727096326.34400: Sending initial data 18911 1727096326.34403: Sent initial data (154 bytes) 18911 1727096326.35051: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096326.35064: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096326.35150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096326.35187: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096326.35204: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096326.35224: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096326.35322: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096326.37020: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18911 1727096326.37112: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18911 1727096326.37187: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18911d7od04qi/tmpdlbobw20 /root/.ansible/tmp/ansible-tmp-1727096326.308392-21017-10241747358960/AnsiballZ_command.py <<< 18911 1727096326.37204: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096326.308392-21017-10241747358960/AnsiballZ_command.py" <<< 18911 1727096326.37244: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18911d7od04qi/tmpdlbobw20" to remote "/root/.ansible/tmp/ansible-tmp-1727096326.308392-21017-10241747358960/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096326.308392-21017-10241747358960/AnsiballZ_command.py" <<< 18911 1727096326.38218: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096326.38544: stderr chunk (state=3): >>><<< 18911 1727096326.38547: stdout chunk (state=3): >>><<< 18911 1727096326.38549: done transferring module to remote 18911 1727096326.38551: _low_level_execute_command(): starting 18911 1727096326.38553: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096326.308392-21017-10241747358960/ /root/.ansible/tmp/ansible-tmp-1727096326.308392-21017-10241747358960/AnsiballZ_command.py && sleep 0' 18911 1727096326.39076: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096326.39089: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096326.39103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096326.39120: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096326.39192: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096326.39242: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096326.39264: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096326.39295: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096326.39391: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096326.41351: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096326.41386: stdout chunk (state=3): >>><<< 18911 1727096326.41389: stderr chunk (state=3): >>><<< 18911 1727096326.41487: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096326.41490: _low_level_execute_command(): starting 18911 1727096326.41493: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096326.308392-21017-10241747358960/AnsiballZ_command.py && sleep 0' 18911 1727096326.42039: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18911 1727096326.42056: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18911 1727096326.42078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096326.42126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096326.42218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096326.42244: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096326.42363: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096326.77183: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 6520 0 --:--:-- --:--:-- --:--:-- 6630\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 15667 0 
--:--:-- --:--:-- --:--:-- 16166", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-23 08:58:46.576980", "end": "2024-09-23 08:58:46.769859", "delta": "0:00:00.192879", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 18911 1727096326.78835: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
<<< 18911 1727096326.78858: stderr chunk (state=3): >>><<< 18911 1727096326.78863: stdout chunk (state=3): >>><<< 18911 1727096326.78889: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 6520 0 --:--:-- --:--:-- --:--:-- 6630\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 15667 0 --:--:-- --:--:-- --:--:-- 16166", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org 
mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-23 08:58:46.576980", "end": "2024-09-23 08:58:46.769859", "delta": "0:00:00.192879", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 18911 1727096326.78922: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096326.308392-21017-10241747358960/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18911 1727096326.78929: _low_level_execute_command(): starting 18911 1727096326.78934: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096326.308392-21017-10241747358960/ > /dev/null 2>&1 && sleep 0' 18911 1727096326.79373: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18911 1727096326.79376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
<<< 18911 1727096326.79379: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration <<< 18911 1727096326.79381: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18911 1727096326.79383: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18911 1727096326.79437: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 18911 1727096326.79445: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18911 1727096326.79447: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18911 1727096326.79513: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18911 1727096326.81419: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18911 1727096326.81441: stderr chunk (state=3): >>><<< 18911 1727096326.81444: stdout chunk (state=3): >>><<< 18911 1727096326.81457: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18911 1727096326.81464: handler run complete 18911 1727096326.81489: Evaluated conditional (False): False 18911 1727096326.81497: attempt loop complete, returning result 18911 1727096326.81499: _execute() done 18911 1727096326.81502: dumping result to json 18911 1727096326.81507: done dumping result, returning 18911 1727096326.81514: done running TaskExecutor() for managed_node1/TASK: Verify DNS and network connectivity [0afff68d-5257-09a7-aae1-00000000057f] 18911 1727096326.81519: sending task result for task 0afff68d-5257-09a7-aae1-00000000057f 18911 1727096326.81618: done sending task result for task 0afff68d-5257-09a7-aae1-00000000057f 18911 1727096326.81621: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "delta": "0:00:00.192879", "end": "2024-09-23 08:58:46.769859", "rc": 0, "start": "2024-09-23 08:58:46.576980" } STDOUT: CHECK DNS AND CONNECTIVITY 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org STDERR: % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 305 100 305 0 0 6520 0 --:--:-- --:--:-- --:--:-- 6630 % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 291 100 291 0 0 15667 0 --:--:-- --:--:-- --:--:-- 16166 18911 1727096326.81688: no more pending results, returning what we have 18911 1727096326.81691: results queue empty 18911 1727096326.81692: 
checking for any_errors_fatal 18911 1727096326.81699: done checking for any_errors_fatal 18911 1727096326.81700: checking for max_fail_percentage 18911 1727096326.81702: done checking for max_fail_percentage 18911 1727096326.81702: checking to see if all hosts have failed and the running result is not ok 18911 1727096326.81703: done checking to see if all hosts have failed 18911 1727096326.81704: getting the remaining hosts for this loop 18911 1727096326.81709: done getting the remaining hosts for this loop 18911 1727096326.81712: getting the next task for host managed_node1 18911 1727096326.81720: done getting next task for host managed_node1 18911 1727096326.81725: ^ task is: TASK: meta (flush_handlers) 18911 1727096326.81727: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18911 1727096326.81732: getting variables 18911 1727096326.81733: in VariableManager get_vars() 18911 1727096326.81762: Calling all_inventory to load vars for managed_node1 18911 1727096326.81764: Calling groups_inventory to load vars for managed_node1 18911 1727096326.81769: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096326.81779: Calling all_plugins_play to load vars for managed_node1 18911 1727096326.81782: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096326.81785: Calling groups_plugins_play to load vars for managed_node1 18911 1727096326.83011: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096326.84195: done with get_vars() 18911 1727096326.84211: done getting variables 18911 1727096326.84262: in VariableManager get_vars() 18911 1727096326.84273: Calling all_inventory to load vars for managed_node1 18911 1727096326.84275: Calling groups_inventory to load vars for managed_node1 18911 1727096326.84276: Calling all_plugins_inventory to load vars for managed_node1 18911 1727096326.84280: Calling all_plugins_play to load vars for managed_node1 18911 1727096326.84281: Calling groups_plugins_inventory to load vars for managed_node1 18911 1727096326.84283: Calling groups_plugins_play to load vars for managed_node1 18911 1727096326.85014: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18911 1727096326.86461: done with get_vars() 18911 1727096326.86516: done queuing things up, now waiting for results queue to drain 18911 1727096326.86519: results queue empty 18911 1727096326.86520: checking for any_errors_fatal 18911 1727096326.86524: done checking for any_errors_fatal 18911 1727096326.86524: checking for max_fail_percentage 18911 1727096326.86525: done checking for max_fail_percentage 18911 1727096326.86526: checking to see if all hosts have failed and the running result is not 
ok
18911 1727096326.86527: done checking to see if all hosts have failed
18911 1727096326.86527: getting the remaining hosts for this loop
18911 1727096326.86528: done getting the remaining hosts for this loop
18911 1727096326.86531: getting the next task for host managed_node1
18911 1727096326.86535: done getting next task for host managed_node1
18911 1727096326.86537: ^ task is: TASK: meta (flush_handlers)
18911 1727096326.86539: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18911 1727096326.86544: getting variables
18911 1727096326.86545: in VariableManager get_vars()
18911 1727096326.86552: Calling all_inventory to load vars for managed_node1
18911 1727096326.86555: Calling groups_inventory to load vars for managed_node1
18911 1727096326.86558: Calling all_plugins_inventory to load vars for managed_node1
18911 1727096326.86565: Calling all_plugins_play to load vars for managed_node1
18911 1727096326.86570: Calling groups_plugins_inventory to load vars for managed_node1
18911 1727096326.86573: Calling groups_plugins_play to load vars for managed_node1
18911 1727096326.87686: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18911 1727096326.88548: done with get_vars()
18911 1727096326.88563: done getting variables
18911 1727096326.88602: in VariableManager get_vars()
18911 1727096326.88610: Calling all_inventory to load vars for managed_node1
18911 1727096326.88612: Calling groups_inventory to load vars for managed_node1
18911 1727096326.88613: Calling all_plugins_inventory to load vars for managed_node1
18911 1727096326.88617: Calling all_plugins_play to load vars for managed_node1
18911 1727096326.88618: Calling groups_plugins_inventory to load vars for managed_node1
18911 1727096326.88620: Calling groups_plugins_play to load vars for managed_node1
18911 1727096326.89347: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18911 1727096326.90815: done with get_vars()
18911 1727096326.90847: done queuing things up, now waiting for results queue to drain
18911 1727096326.90850: results queue empty
18911 1727096326.90851: checking for any_errors_fatal
18911 1727096326.90852: done checking for any_errors_fatal
18911 1727096326.90853: checking for max_fail_percentage
18911 1727096326.90854: done checking for max_fail_percentage
18911 1727096326.90855: checking to see if all hosts have failed and the running result is not ok
18911 1727096326.90855: done checking to see if all hosts have failed
18911 1727096326.90856: getting the remaining hosts for this loop
18911 1727096326.90857: done getting the remaining hosts for this loop
18911 1727096326.90860: getting the next task for host managed_node1
18911 1727096326.90863: done getting next task for host managed_node1
18911 1727096326.90863: ^ task is: None
18911 1727096326.90865: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18911 1727096326.90866: done queuing things up, now waiting for results queue to drain
18911 1727096326.90868: results queue empty
18911 1727096326.90869: checking for any_errors_fatal
18911 1727096326.90870: done checking for any_errors_fatal
18911 1727096326.90870: checking for max_fail_percentage
18911 1727096326.90871: done checking for max_fail_percentage
18911 1727096326.90872: checking to see if all hosts have failed and the running result is not ok
18911 1727096326.90873: done checking to see if all hosts have failed
18911 1727096326.90874: getting the next task for host managed_node1
18911 1727096326.90876: done getting next task for host managed_node1
18911 1727096326.90877: ^ task is: None
18911 1727096326.90878: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed_node1              : ok=82   changed=3    unreachable=0    failed=0    skipped=74   rescued=0    ignored=1

Monday 23 September 2024  08:58:46 -0400 (0:00:00.663)       0:00:46.023 ******
===============================================================================
fedora.linux_system_roles.network : Check which services are running ---- 2.30s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.96s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.95s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Gathering Facts --------------------------------------------------------- 1.59s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml:6
fedora.linux_system_roles.network : Check which packages are installed --- 1.45s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.40s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Create veth interface lsr27 --------------------------------------------- 1.39s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 1.35s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Gathering Facts --------------------------------------------------------- 1.26s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5
Gathering Facts --------------------------------------------------------- 1.23s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:3
Gathering Facts --------------------------------------------------------- 1.19s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3
Gathering Facts --------------------------------------------------------- 1.11s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:50
Gathering Facts --------------------------------------------------------- 1.08s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:68
Gathering Facts --------------------------------------------------------- 1.07s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:13
Gathering Facts --------------------------------------------------------- 1.04s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3
Gathering Facts --------------------------------------------------------- 1.03s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:33
Gathering Facts --------------------------------------------------------- 0.97s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:77
fedora.linux_system_roles.network : Check which packages are installed --- 0.96s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Check which packages are installed --- 0.88s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Install iproute --------------------------------------------------------- 0.81s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
18911 1727096326.90995: RUNNING CLEANUP