43681 1727204691.92081: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-twx
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
43681 1727204691.92624: Added group all to inventory
43681 1727204691.92626: Added group ungrouped to inventory
43681 1727204691.92632: Group all now contains ungrouped
43681 1727204691.92635: Examining possible inventory source: /tmp/network-6Zh/inventory-Sfc.yml
43681 1727204692.10183: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
43681 1727204692.10246: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
43681 1727204692.10268: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
43681 1727204692.10322: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
43681 1727204692.10392: Loaded config def from plugin (inventory/script)
43681 1727204692.10396: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
43681 1727204692.10437: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
43681 1727204692.10512: Loaded config def from plugin (inventory/yaml)
43681 1727204692.10514: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
43681 1727204692.10591: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
43681 1727204692.10962: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
43681 1727204692.10965: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
43681 1727204692.10967: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
43681 1727204692.10972: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
43681 1727204692.10976: Loading data from /tmp/network-6Zh/inventory-Sfc.yml
43681 1727204692.11037: /tmp/network-6Zh/inventory-Sfc.yml was not parsable by auto
43681 1727204692.11087: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
43681 1727204692.11126: Loading data from /tmp/network-6Zh/inventory-Sfc.yml
43681 1727204692.11195: group all already in inventory
43681 1727204692.11201: set inventory_file for managed-node1
43681 1727204692.11204: set inventory_dir for managed-node1
43681 1727204692.11205: Added host managed-node1 to inventory
43681 1727204692.11207: Added host managed-node1 to group all
43681 1727204692.11207: set ansible_host for managed-node1
43681 1727204692.11208: set ansible_ssh_extra_args for managed-node1
43681 1727204692.11210: set inventory_file for managed-node2
43681 1727204692.11212: set inventory_dir for managed-node2
43681 1727204692.11213: Added host managed-node2 to inventory
43681 1727204692.11214: Added host managed-node2 to group
all 43681 1727204692.11215: set ansible_host for managed-node2 43681 1727204692.11215: set ansible_ssh_extra_args for managed-node2 43681 1727204692.11219: set inventory_file for managed-node3 43681 1727204692.11221: set inventory_dir for managed-node3 43681 1727204692.11222: Added host managed-node3 to inventory 43681 1727204692.11223: Added host managed-node3 to group all 43681 1727204692.11223: set ansible_host for managed-node3 43681 1727204692.11224: set ansible_ssh_extra_args for managed-node3 43681 1727204692.11226: Reconcile groups and hosts in inventory. 43681 1727204692.11230: Group ungrouped now contains managed-node1 43681 1727204692.11232: Group ungrouped now contains managed-node2 43681 1727204692.11233: Group ungrouped now contains managed-node3 43681 1727204692.11297: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 43681 1727204692.11403: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 43681 1727204692.11445: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 43681 1727204692.11470: Loaded config def from plugin (vars/host_group_vars) 43681 1727204692.11472: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 43681 1727204692.11478: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 43681 1727204692.11484: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 43681 1727204692.11522: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 43681 1727204692.11795: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204692.11894: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 43681 1727204692.11951: Loaded config def from plugin (connection/local) 43681 1727204692.11954: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 43681 1727204692.12791: Loaded config def from plugin (connection/paramiko_ssh) 43681 1727204692.12795: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 43681 1727204692.13884: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 43681 1727204692.13939: Loaded config def from plugin (connection/psrp) 43681 1727204692.13942: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 43681 1727204692.14562: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 43681 1727204692.14597: Loaded config def from plugin (connection/ssh) 43681 1727204692.14599: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 43681 1727204692.16242: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 43681 1727204692.16275: Loaded config def from plugin (connection/winrm) 43681 1727204692.16278: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 43681 1727204692.16305: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 43681 1727204692.16363: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 43681 1727204692.16423: Loaded config def from plugin (shell/cmd) 43681 1727204692.16424: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 43681 1727204692.16446: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 43681 1727204692.16505: Loaded config def from plugin (shell/powershell) 43681 1727204692.16507: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 43681 1727204692.16552: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 43681 1727204692.16707: Loaded config def from plugin (shell/sh) 43681 1727204692.16709: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 43681 1727204692.16738: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 43681 1727204692.16845: Loaded config def from plugin (become/runas) 43681 1727204692.16847: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 43681 1727204692.17009: Loaded config def from plugin (become/su) 43681 1727204692.17012: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 43681 1727204692.17151: Loaded config def from plugin (become/sudo) 43681 1727204692.17153: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 43681 1727204692.17181: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_routing_rules_nm.yml 43681 1727204692.17459: in VariableManager get_vars() 43681 1727204692.17476: done with get_vars() 43681 1727204692.17587: trying /usr/local/lib/python3.12/site-packages/ansible/modules 43681 1727204692.19968: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 43681 1727204692.20062: in VariableManager get_vars() 43681 1727204692.20066: done with get_vars() 43681 1727204692.20069: variable 'playbook_dir' from source: magic vars 43681 1727204692.20070: variable 'ansible_playbook_python' from source: magic vars 43681 1727204692.20070: variable 'ansible_config_file' 
from source: magic vars 43681 1727204692.20071: variable 'groups' from source: magic vars 43681 1727204692.20072: variable 'omit' from source: magic vars 43681 1727204692.20072: variable 'ansible_version' from source: magic vars 43681 1727204692.20073: variable 'ansible_check_mode' from source: magic vars 43681 1727204692.20073: variable 'ansible_diff_mode' from source: magic vars 43681 1727204692.20074: variable 'ansible_forks' from source: magic vars 43681 1727204692.20074: variable 'ansible_inventory_sources' from source: magic vars 43681 1727204692.20075: variable 'ansible_skip_tags' from source: magic vars 43681 1727204692.20076: variable 'ansible_limit' from source: magic vars 43681 1727204692.20076: variable 'ansible_run_tags' from source: magic vars 43681 1727204692.20077: variable 'ansible_verbosity' from source: magic vars 43681 1727204692.20108: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml 43681 1727204692.20943: in VariableManager get_vars() 43681 1727204692.20956: done with get_vars() 43681 1727204692.20985: in VariableManager get_vars() 43681 1727204692.20997: done with get_vars() 43681 1727204692.21025: in VariableManager get_vars() 43681 1727204692.21036: done with get_vars() 43681 1727204692.21096: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ 43681 1727204692.21196: in VariableManager get_vars() 43681 1727204692.21207: done with get_vars() 43681 1727204692.21211: variable 'omit' from source: magic vars 43681 1727204692.21229: variable 'omit' from source: magic vars 43681 1727204692.21256: in VariableManager get_vars() 43681 1727204692.21266: done with get_vars() 43681 1727204692.21319: in VariableManager get_vars() 43681 1727204692.21330: done with get_vars() 43681 1727204692.21359: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 43681 1727204692.21543: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 43681 1727204692.21652: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 43681 1727204692.22231: in VariableManager get_vars() 43681 1727204692.22250: done with get_vars() 43681 1727204692.22609: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 43681 1727204692.26091: in VariableManager get_vars() 43681 1727204692.26094: done with get_vars() 43681 1727204692.26096: variable 'playbook_dir' from source: magic vars 43681 1727204692.26097: variable 'ansible_playbook_python' from source: magic vars 43681 1727204692.26097: variable 'ansible_config_file' from source: magic vars 43681 1727204692.26098: variable 'groups' from source: magic vars 43681 1727204692.26098: variable 'omit' from source: magic vars 43681 1727204692.26099: variable 'ansible_version' from source: magic vars 43681 1727204692.26100: variable 'ansible_check_mode' from source: magic vars 43681 1727204692.26100: variable 'ansible_diff_mode' from source: magic vars 43681 1727204692.26101: variable 'ansible_forks' from source: magic vars 43681 1727204692.26101: variable 'ansible_inventory_sources' from source: magic vars 43681 1727204692.26102: variable 'ansible_skip_tags' from source: magic vars 43681 1727204692.26102: variable 'ansible_limit' from source: magic vars 43681 1727204692.26103: variable 
'ansible_run_tags' from source: magic vars 43681 1727204692.26104: variable 'ansible_verbosity' from source: magic vars 43681 1727204692.26133: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml 43681 1727204692.26195: in VariableManager get_vars() 43681 1727204692.26198: done with get_vars() 43681 1727204692.26200: variable 'playbook_dir' from source: magic vars 43681 1727204692.26200: variable 'ansible_playbook_python' from source: magic vars 43681 1727204692.26201: variable 'ansible_config_file' from source: magic vars 43681 1727204692.26202: variable 'groups' from source: magic vars 43681 1727204692.26202: variable 'omit' from source: magic vars 43681 1727204692.26203: variable 'ansible_version' from source: magic vars 43681 1727204692.26203: variable 'ansible_check_mode' from source: magic vars 43681 1727204692.26204: variable 'ansible_diff_mode' from source: magic vars 43681 1727204692.26204: variable 'ansible_forks' from source: magic vars 43681 1727204692.26205: variable 'ansible_inventory_sources' from source: magic vars 43681 1727204692.26206: variable 'ansible_skip_tags' from source: magic vars 43681 1727204692.26206: variable 'ansible_limit' from source: magic vars 43681 1727204692.26207: variable 'ansible_run_tags' from source: magic vars 43681 1727204692.26207: variable 'ansible_verbosity' from source: magic vars 43681 1727204692.26235: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml 43681 1727204692.26293: in VariableManager get_vars() 43681 1727204692.26304: done with get_vars() 43681 1727204692.26339: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 43681 1727204692.26440: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 43681 1727204692.26502: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 43681 1727204692.26825: in VariableManager get_vars() 43681 1727204692.26843: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 43681 1727204692.28187: in VariableManager get_vars() 43681 1727204692.28200: done with get_vars() 43681 1727204692.28229: in VariableManager get_vars() 43681 1727204692.28231: done with get_vars() 43681 1727204692.28233: variable 'playbook_dir' from source: magic vars 43681 1727204692.28234: variable 'ansible_playbook_python' from source: magic vars 43681 1727204692.28234: variable 'ansible_config_file' from source: magic vars 43681 1727204692.28235: variable 'groups' from source: magic vars 43681 1727204692.28235: variable 'omit' from source: magic vars 43681 1727204692.28236: variable 'ansible_version' from source: magic vars 43681 1727204692.28237: variable 'ansible_check_mode' from source: magic vars 43681 1727204692.28237: variable 'ansible_diff_mode' from source: magic vars 43681 1727204692.28238: variable 'ansible_forks' from source: magic vars 43681 1727204692.28239: variable 'ansible_inventory_sources' from source: magic vars 43681 1727204692.28240: variable 'ansible_skip_tags' from source: magic vars 43681 1727204692.28240: variable 'ansible_limit' from source: magic vars 43681 1727204692.28241: variable 'ansible_run_tags' from source: magic vars 43681 1727204692.28242: variable 'ansible_verbosity' from source: magic vars 43681 1727204692.28269: 
Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml 43681 1727204692.28335: in VariableManager get_vars() 43681 1727204692.28345: done with get_vars() 43681 1727204692.28380: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 43681 1727204692.28468: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 43681 1727204692.28530: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 43681 1727204692.28846: in VariableManager get_vars() 43681 1727204692.28860: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 43681 1727204692.30292: in VariableManager get_vars() 43681 1727204692.30311: done with get_vars() 43681 1727204692.30358: in VariableManager get_vars() 43681 1727204692.30374: done with get_vars() 43681 1727204692.30421: in VariableManager get_vars() 43681 1727204692.30437: done with get_vars() 43681 1727204692.30521: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 43681 1727204692.30540: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 43681 1727204692.30837: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 43681 1727204692.31072: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 43681 1727204692.31075: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-twx/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) 43681 1727204692.31116: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 43681 1727204692.31152: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 43681 1727204692.31394: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 43681 1727204692.31480: Loaded config def from plugin (callback/default) 43681 1727204692.31483: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 43681 1727204692.32986: Loaded config def from plugin (callback/junit) 43681 1727204692.32991: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 43681 1727204692.33049: Loading ModuleDocFragment 'result_format_callback' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
43681 1727204692.33142: Loaded config def from plugin (callback/minimal)
43681 1727204692.33145: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
43681 1727204692.33197: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
43681 1727204692.33280: Loaded config def from plugin (callback/tree)
43681 1727204692.33283: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
43681 1727204692.33441: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
43681 1727204692.33444: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-twx/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
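At this point every plugin the run needs has been loaded: the YAML inventory at /tmp/network-6Zh/inventory-Sfc.yml was parsed into three ungrouped hosts (managed-node1 through managed-node3), each with ansible_host and ansible_ssh_extra_args host vars, and ansible.posix.debug has been selected as the stdout callback with ansible.posix.profile_tasks enabled for per-task timing. As an editorial aid, the sketch below shows a minimal inventory that would produce the same "Added host ... / set ansible_host ... / set ansible_ssh_extra_args ..." entries seen near the top of this log; the file name matches the log, but the addresses and SSH options are placeholders, not values recovered from this run.

```yaml
# Hypothetical reconstruction of /tmp/network-6Zh/inventory-Sfc.yml (illustration only).
# Host names and the two host vars mirror the log; every value here is a placeholder.
all:
  hosts:
    managed-node1:
      ansible_host: 198.51.100.11
      ansible_ssh_extra_args: "-o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null"
    managed-node2:
      ansible_host: 198.51.100.12
      ansible_ssh_extra_args: "-o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null"
    managed-node3:
      ansible_host: 198.51.100.13
      ansible_ssh_extra_args: "-o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null"
```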
PLAYBOOK: tests_routing_rules_nm.yml *******************************************
6 plays in /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_routing_rules_nm.yml
43681 1727204692.33478: in VariableManager get_vars()
43681 1727204692.33497: done with get_vars()
43681 1727204692.33505: in VariableManager get_vars()
43681 1727204692.33516: done with get_vars()
43681 1727204692.33521: variable 'omit' from source: magic vars
43681 1727204692.33573: in VariableManager get_vars()
43681 1727204692.33593: done with get_vars()
43681 1727204692.33618: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_routing_rules.yml' with nm as provider] ****
43681 1727204692.34284: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
43681 1727204692.34377: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
43681 1727204692.34412: getting the remaining hosts for this loop
43681 1727204692.34414: done getting the remaining hosts for this loop
43681 1727204692.34422: getting the next task for host managed-node3
43681 1727204692.34426: done getting next task for host managed-node3
43681 1727204692.34428: ^ task is: TASK: Gathering Facts
43681 1727204692.34430: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
43681 1727204692.34433: getting variables
43681 1727204692.34434: in VariableManager get_vars()
43681 1727204692.34446: Calling all_inventory to load vars for managed-node3
43681 1727204692.34449: Calling groups_inventory to load vars for managed-node3
43681 1727204692.34452: Calling all_plugins_inventory to load vars for managed-node3
43681 1727204692.34466: Calling all_plugins_play to load vars for managed-node3
43681 1727204692.34481: Calling groups_plugins_inventory to load vars for managed-node3
43681 1727204692.34486: Calling groups_plugins_play to load vars for managed-node3
43681 1727204692.34535: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
43681 1727204692.34630: done with get_vars()
43681 1727204692.34638: done getting variables
43681 1727204692.34788: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_routing_rules_nm.yml:6
Tuesday 24 September 2024 15:04:52 -0400 (0:00:00.014) 0:00:00.014 *****
43681 1727204692.34807: entering _queue_task() for managed-node3/gather_facts
43681 1727204692.34808: Creating lock for gather_facts
43681 1727204692.35116: worker is 1 (out of 1 available)
43681 1727204692.35129: exiting _queue_task() for managed-node3/gather_facts
43681 1727204692.35144: done queuing things up, now waiting for results queue to drain
43681 1727204692.35146: waiting for pending results...
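The queued "Gathering Facts" task is the implicit fact-gathering of the first play (its task path points at line 6 of tests_routing_rules_nm.yml). In the fedora.linux_system_roles test layout, the *_nm.yml files are normally thin wrappers that pin the provider and then import the provider-independent test playbook; the sketch below is an illustrative reconstruction of such a wrapper based on the play name and the playbooks loaded earlier in this log, not a verbatim copy from the collection.

```yaml
# Illustrative sketch of a tests_*_nm.yml wrapper (assumption, not copied from the repo).
---
- name: Run playbook 'playbooks/tests_routing_rules.yml' with nm as provider
  hosts: all
  tasks:
    - name: Set network provider to 'nm'
      set_fact:
        network_provider: nm

# The remaining plays of the "6 plays" reported above come from the imported
# provider-independent test, which itself pulls in helpers such as
# down_profile+delete_interface.yml, down_profile.yml and remove_profile.yml.
- import_playbook: playbooks/tests_routing_rules.yml
```

Because the wrapper play does not disable fact gathering, the setup module runs first; the module transfer and execution that follow in this log are that AnsiballZ_setup.py payload being built, copied over SFTP, and executed on managed-node3.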
43681 1727204692.35306: running TaskExecutor() for managed-node3/TASK: Gathering Facts 43681 1727204692.35376: in run() - task 12b410aa-8751-9e86-7728-0000000000af 43681 1727204692.35393: variable 'ansible_search_path' from source: unknown 43681 1727204692.35429: calling self._execute() 43681 1727204692.35481: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204692.35489: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204692.35507: variable 'omit' from source: magic vars 43681 1727204692.35586: variable 'omit' from source: magic vars 43681 1727204692.35619: variable 'omit' from source: magic vars 43681 1727204692.35648: variable 'omit' from source: magic vars 43681 1727204692.35684: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204692.35720: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204692.35741: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204692.35757: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204692.35767: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204692.35797: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204692.35801: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204692.35806: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204692.35892: Set connection var ansible_shell_type to sh 43681 1727204692.35898: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204692.35906: Set connection var ansible_timeout to 10 43681 1727204692.35915: Set connection var ansible_pipelining to False 43681 1727204692.35923: Set connection var ansible_connection to ssh 43681 1727204692.35931: Set connection var ansible_shell_executable to /bin/sh 43681 1727204692.35953: variable 'ansible_shell_executable' from source: unknown 43681 1727204692.35957: variable 'ansible_connection' from source: unknown 43681 1727204692.35959: variable 'ansible_module_compression' from source: unknown 43681 1727204692.35963: variable 'ansible_shell_type' from source: unknown 43681 1727204692.35967: variable 'ansible_shell_executable' from source: unknown 43681 1727204692.35970: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204692.35976: variable 'ansible_pipelining' from source: unknown 43681 1727204692.35980: variable 'ansible_timeout' from source: unknown 43681 1727204692.35985: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204692.36170: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 43681 1727204692.36180: variable 'omit' from source: magic vars 43681 1727204692.36187: starting attempt loop 43681 1727204692.36191: running the handler 43681 1727204692.36207: variable 'ansible_facts' from source: unknown 43681 1727204692.36228: _low_level_execute_command(): starting 43681 1727204692.36236: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 43681 1727204692.36857: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204692.36861: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204692.36864: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204692.36920: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204692.36924: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204692.36980: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204692.38736: stdout chunk (state=3): >>>/root <<< 43681 1727204692.38844: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204692.38920: stderr chunk (state=3): >>><<< 43681 1727204692.38924: stdout chunk (state=3): >>><<< 43681 1727204692.38928: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204692.38935: _low_level_execute_command(): starting 43681 1727204692.38941: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204692.3892121-43795-149885710218505 `" && echo ansible-tmp-1727204692.3892121-43795-149885710218505="` echo /root/.ansible/tmp/ansible-tmp-1727204692.3892121-43795-149885710218505 `" ) && sleep 0' 43681 1727204692.39397: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204692.39402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204692.39405: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204692.39407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204692.39465: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204692.39472: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204692.39509: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204692.41506: stdout chunk (state=3): >>>ansible-tmp-1727204692.3892121-43795-149885710218505=/root/.ansible/tmp/ansible-tmp-1727204692.3892121-43795-149885710218505 <<< 43681 1727204692.41626: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204692.41675: stderr chunk (state=3): >>><<< 43681 1727204692.41679: stdout chunk (state=3): >>><<< 43681 1727204692.41691: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204692.3892121-43795-149885710218505=/root/.ansible/tmp/ansible-tmp-1727204692.3892121-43795-149885710218505 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204692.41721: variable 'ansible_module_compression' from source: unknown 43681 1727204692.41768: ANSIBALLZ: Using generic lock for ansible.legacy.setup 43681 1727204692.41773: ANSIBALLZ: Acquiring lock 43681 1727204692.41776: ANSIBALLZ: Lock acquired: 140156138759584 43681 1727204692.41779: ANSIBALLZ: Creating module 43681 1727204692.68598: ANSIBALLZ: Writing module into payload 43681 1727204692.68643: ANSIBALLZ: Writing module 43681 1727204692.68685: ANSIBALLZ: 
Renaming module 43681 1727204692.68703: ANSIBALLZ: Done creating module 43681 1727204692.68764: variable 'ansible_facts' from source: unknown 43681 1727204692.68779: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204692.68798: _low_level_execute_command(): starting 43681 1727204692.68811: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 43681 1727204692.69619: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204692.69696: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204692.69743: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204692.69819: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204692.71585: stdout chunk (state=3): >>>PLATFORM <<< 43681 1727204692.71666: stdout chunk (state=3): >>>Linux <<< 43681 1727204692.71696: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 <<< 43681 1727204692.71700: stdout chunk (state=3): >>>/usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 43681 1727204692.71840: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204692.71893: stderr chunk (state=3): >>><<< 43681 1727204692.71896: stdout chunk (state=3): >>><<< 43681 1727204692.71913: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204692.71926 [managed-node3]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 43681 1727204692.71965: _low_level_execute_command(): starting 43681 1727204692.71969: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 43681 1727204692.72063: Sending initial data 43681 1727204692.72066: Sent initial data (1181 bytes) 43681 1727204692.72431: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204692.72434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 43681 1727204692.72437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204692.72439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204692.72492: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204692.72496: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204692.72534: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204692.76202: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"39 (Thirty Nine)\"\nID=fedora\nVERSION_ID=39\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f39\"\nPRETTY_NAME=\"Fedora Linux 39 (Thirty Nine)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:39\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=39\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=39\nSUPPORT_END=2024-11-12\n"} <<< 43681 1727204692.76620: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204692.76670: stderr chunk (state=3): >>><<< 43681 1727204692.76674: stdout chunk (state=3): >>><<< 43681 1727204692.76688: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"39 (Thirty Nine)\"\nID=fedora\nVERSION_ID=39\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f39\"\nPRETTY_NAME=\"Fedora Linux 39 (Thirty 
Nine)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:39\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=39\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=39\nSUPPORT_END=2024-11-12\n"} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204692.76766: variable 'ansible_facts' from source: unknown 43681 1727204692.76769: variable 'ansible_facts' from source: unknown 43681 1727204692.76779: variable 'ansible_module_compression' from source: unknown 43681 1727204692.76813: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-436817kzep24q/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 43681 1727204692.76838: variable 'ansible_facts' from source: unknown 43681 1727204692.76953: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204692.3892121-43795-149885710218505/AnsiballZ_setup.py 43681 1727204692.77072: Sending initial data 43681 1727204692.77076: Sent initial data (154 bytes) 43681 1727204692.77539: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204692.77542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 43681 1727204692.77545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 43681 1727204692.77549: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204692.77596: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204692.77599: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204692.77636: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204692.79275: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 43681 1727204692.79280: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 43681 1727204692.79310: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 43681 1727204692.79347: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-436817kzep24q/tmp68x8hvku /root/.ansible/tmp/ansible-tmp-1727204692.3892121-43795-149885710218505/AnsiballZ_setup.py <<< 43681 1727204692.79350: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204692.3892121-43795-149885710218505/AnsiballZ_setup.py" <<< 43681 1727204692.79381: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-436817kzep24q/tmp68x8hvku" to remote "/root/.ansible/tmp/ansible-tmp-1727204692.3892121-43795-149885710218505/AnsiballZ_setup.py" <<< 43681 1727204692.79388: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204692.3892121-43795-149885710218505/AnsiballZ_setup.py" <<< 43681 1727204692.81019: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204692.81094: stderr chunk (state=3): >>><<< 43681 1727204692.81098: stdout chunk (state=3): >>><<< 43681 1727204692.81126: done transferring module to remote 43681 1727204692.81142: _low_level_execute_command(): starting 43681 1727204692.81149: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204692.3892121-43795-149885710218505/ /root/.ansible/tmp/ansible-tmp-1727204692.3892121-43795-149885710218505/AnsiballZ_setup.py && sleep 0' 43681 1727204692.81642: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204692.81648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 43681 1727204692.81655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204692.81657: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204692.81659: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204692.81712: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204692.81715: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204692.81757: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204692.83592: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204692.83649: stderr chunk (state=3): >>><<< 43681 1727204692.83652: stdout chunk (state=3): >>><<< 43681 1727204692.83667: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204692.83670: _low_level_execute_command(): starting 43681 1727204692.83676: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204692.3892121-43795-149885710218505/AnsiballZ_setup.py && sleep 0' 43681 1727204692.84164: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204692.84167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204692.84170: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204692.84172: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204692.84174: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204692.84220: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204692.84223: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204692.84278: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204692.86433: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 43681 1727204692.86467: stdout chunk (state=3): >>>import _imp # builtin <<< 43681 1727204692.86503: stdout chunk (state=3): >>>import '_thread' # <<< 43681 1727204692.86509: stdout chunk (state=3): >>>import '_warnings' # import '_weakref' # <<< 43681 1727204692.86583: stdout chunk (state=3): >>>import '_io' # <<< 43681 1727204692.86590: stdout chunk (state=3): >>>import 'marshal' # <<< 43681 1727204692.86624: stdout chunk (state=3): >>>import 'posix' # <<< 43681 1727204692.86661: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 43681 1727204692.86699: stdout chunk (state=3): >>>import 'time' # <<< 43681 1727204692.86703: stdout chunk (state=3): >>>import 'zipimport' # <<< 43681 1727204692.86709: stdout chunk (state=3): >>># installed zipimport hook <<< 43681 1727204692.86760: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 43681 1727204692.86768: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 43681 1727204692.86783: stdout chunk (state=3): >>>import '_codecs' # <<< 43681 1727204692.86790: stdout chunk (state=3): >>> <<< 43681 1727204692.86809: stdout chunk (state=3): >>>import 'codecs' # <<< 43681 1727204692.86853: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 43681 1727204692.86871: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 43681 1727204692.86891: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19538b44d0> <<< 43681 1727204692.86897: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1953883ad0> <<< 43681 1727204692.86924: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 43681 1727204692.86936: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19538b6a20> <<< 43681 1727204692.86964: stdout chunk (state=3): >>>import '_signal' # <<< 43681 1727204692.86988: stdout chunk (state=3): >>>import '_abc' # <<< 43681 1727204692.86994: stdout chunk (state=3): >>>import 'abc' # <<< 43681 1727204692.87012: stdout chunk (state=3): >>>import 'io' # <<< 43681 1727204692.87048: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 43681 1727204692.87146: stdout chunk (state=3): >>>import '_collections_abc' # <<< 43681 1727204692.87179: stdout chunk (state=3): >>>import 'genericpath' # <<< 43681 1727204692.87187: stdout chunk (state=3): >>>import 'posixpath' # <<< 43681 1727204692.87206: stdout chunk (state=3): >>>import 'os' # <<< 43681 
1727204692.87226: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 43681 1727204692.87244: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages <<< 43681 1727204692.87269: stdout chunk (state=3): >>>Adding directory: '/usr/local/lib/python3.12/site-packages' <<< 43681 1727204692.87281: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<< 43681 1727204692.87284: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 43681 1727204692.87313: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 43681 1727204692.87341: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19536690a0> <<< 43681 1727204692.87412: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 43681 1727204692.87421: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 43681 1727204692.87427: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1953669fd0> <<< 43681 1727204692.87459: stdout chunk (state=3): >>>import 'site' # <<< 43681 1727204692.87492: stdout chunk (state=3): >>>Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
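Note: the `import ...` and `# ... matches ...` records that follow are Python's own import trace. The wrapper was launched with PYTHONVERBOSE=1 (see the _low_level_execute_command line above), so every module the remote interpreter loads while running AnsiballZ_setup.py is echoed on stdout and captured here as "stdout chunk" records. A minimal sketch of the equivalent manual invocation over a multiplexed SSH connection follows; the ControlMaster options approximate Ansible's defaults, the ControlPath value is an assumption (Ansible derives its own), and the temporary AnsiballZ_setup.py path only exists while the task runs.

    # Sketch: re-run the wrapped module by hand the way the controller does.
    # The remote command string mirrors the log above; ControlPath is assumed.
    import os
    import shlex
    import subprocess

    remote_cmd = (
        "PYTHONVERBOSE=1 /usr/bin/python3.12 "
        "/root/.ansible/tmp/ansible-tmp-1727204692.3892121-43795-149885710218505/"
        "AnsiballZ_setup.py && sleep 0"
    )

    ssh_cmd = [
        "ssh", "-vvv",
        "-o", "ControlMaster=auto",
        "-o", "ControlPersist=60s",
        "-o", "ControlPath=" + os.path.expanduser("~/.ansible/cp/%C"),
        "root@10.31.10.90",
        "/bin/sh -c " + shlex.quote(remote_cmd),
    ]

    result = subprocess.run(ssh_cmd, capture_output=True, text=True)
    print(result.returncode)     # rc=0 in the log above
    print(result.stderr[:120])   # OpenSSH debug output, i.e. the stderr chunks
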
<<< 43681 1727204692.87897: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 43681 1727204692.87903: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 43681 1727204692.87929: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 43681 1727204692.87937: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 43681 1727204692.87964: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 43681 1727204692.88002: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 43681 1727204692.88024: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 43681 1727204692.88061: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 43681 1727204692.88067: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19536a7da0> <<< 43681 1727204692.88086: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 43681 1727204692.88107: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 43681 1727204692.88131: stdout chunk (state=3): >>>import '_operator' # <<< 43681 1727204692.88138: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19536a7fe0> <<< 43681 1727204692.88156: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 43681 1727204692.88188: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 43681 1727204692.88211: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 43681 1727204692.88264: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 43681 1727204692.88283: stdout chunk (state=3): >>>import 'itertools' # <<< 43681 1727204692.88309: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19536df7a0> <<< 43681 1727204692.88338: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 43681 1727204692.88352: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<< 43681 1727204692.88363: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19536dfe30> <<< 43681 1727204692.88368: stdout chunk (state=3): >>>import '_collections' # <<< 43681 1727204692.88426: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19536bfa40> <<< 43681 1727204692.88436: stdout chunk (state=3): 
>>>import '_functools' # <<< 43681 1727204692.88467: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19536bd190> <<< 43681 1727204692.88564: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19536a4f50> <<< 43681 1727204692.88597: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 43681 1727204692.88612: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 43681 1727204692.88630: stdout chunk (state=3): >>>import '_sre' # <<< 43681 1727204692.88650: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 43681 1727204692.88677: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 43681 1727204692.88704: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 43681 1727204692.88707: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 43681 1727204692.88738: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1953703680> <<< 43681 1727204692.88759: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19537022a0> <<< 43681 1727204692.88792: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py <<< 43681 1727204692.88798: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19536be180> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1953700b30> <<< 43681 1727204692.88858: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 43681 1727204692.88867: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1953734680> <<< 43681 1727204692.88869: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19536a41d0> <<< 43681 1727204692.88897: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 43681 1727204692.88936: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 43681 1727204692.88940: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1953734b30> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19537349e0> <<< 43681 1727204692.88976: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 43681 
1727204692.88988: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1953734da0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19536a2d20> <<< 43681 1727204692.89026: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 43681 1727204692.89055: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 43681 1727204692.89077: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 43681 1727204692.89103: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1953735460> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1953735130> <<< 43681 1727204692.89109: stdout chunk (state=3): >>>import 'importlib.machinery' # <<< 43681 1727204692.89148: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py <<< 43681 1727204692.89152: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 43681 1727204692.89176: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1953736360> <<< 43681 1727204692.89180: stdout chunk (state=3): >>>import 'importlib.util' # <<< 43681 1727204692.89188: stdout chunk (state=3): >>>import 'runpy' # <<< 43681 1727204692.89215: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 43681 1727204692.89247: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 43681 1727204692.89280: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' <<< 43681 1727204692.89288: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1953750590> <<< 43681 1727204692.89303: stdout chunk (state=3): >>>import 'errno' # <<< 43681 1727204692.89334: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 43681 1727204692.89342: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1953751cd0> <<< 43681 1727204692.89365: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 43681 1727204692.89383: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 43681 1727204692.89400: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 43681 1727204692.89411: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 43681 1727204692.89420: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1953752bd0> <<< 43681 1727204692.89455: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1953753230> <<< 43681 1727204692.89474: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1953752120> <<< 43681 1727204692.89497: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 43681 1727204692.89512: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 43681 1727204692.89542: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 43681 1727204692.89563: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1953753c80> <<< 43681 1727204692.89569: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19537533e0> <<< 43681 1727204692.89616: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19537363c0> <<< 43681 1727204692.89634: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 43681 1727204692.89663: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 43681 1727204692.89685: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 43681 1727204692.89710: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 43681 1727204692.89739: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' <<< 43681 1727204692.89745: stdout chunk (state=3): >>># extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1953487bc0> <<< 43681 1727204692.89768: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py <<< 43681 1727204692.89774: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 43681 1727204692.89803: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' <<< 43681 1727204692.89811: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19534b0680> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19534b03e0> <<< 43681 
1727204692.89833: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' <<< 43681 1727204692.89839: stdout chunk (state=3): >>># extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19534b06b0> <<< 43681 1727204692.89865: stdout chunk (state=3): >>># extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' <<< 43681 1727204692.89871: stdout chunk (state=3): >>># extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19534b0890> <<< 43681 1727204692.89892: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1953485d60> <<< 43681 1727204692.89913: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 43681 1727204692.90017: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 43681 1727204692.90039: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 43681 1727204692.90053: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 43681 1727204692.90060: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19534b1f70> <<< 43681 1727204692.90091: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19534b0bf0> <<< 43681 1727204692.90108: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1953736ab0> <<< 43681 1727204692.90137: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 43681 1727204692.90188: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 43681 1727204692.90211: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 43681 1727204692.90256: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 43681 1727204692.90292: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19534de330> <<< 43681 1727204692.90334: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 43681 1727204692.90360: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 43681 1727204692.90375: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 43681 1727204692.90399: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 43681 1727204692.90447: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f19534f6480> <<< 43681 1727204692.90476: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 43681 1727204692.90511: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 43681 1727204692.90572: stdout chunk (state=3): >>>import 'ntpath' # <<< 43681 1727204692.90601: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1953533230> <<< 43681 1727204692.90621: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 43681 1727204692.90661: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 43681 1727204692.90688: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 43681 1727204692.90730: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 43681 1727204692.90821: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19535559d0> <<< 43681 1727204692.90904: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1953533350> <<< 43681 1727204692.90945: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19534f7110> <<< 43681 1727204692.90974: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1953330380> <<< 43681 1727204692.91000: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19534f54c0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19534b2ed0> <<< 43681 1727204692.91166: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 43681 1727204692.91191: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f19534f55e0> <<< 43681 1727204692.91367: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_f1hbn4wr/ansible_ansible.legacy.setup_payload.zip' <<< 43681 1727204692.91373: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204692.91522: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204692.91555: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 43681 1727204692.91568: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 43681 1727204692.91609: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 43681 1727204692.91688: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 43681 1727204692.91724: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1953396000> <<< 43681 1727204692.91735: stdout chunk (state=3): >>>import '_typing' # <<< 43681 1727204692.91933: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f195336cef0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1953333f80> <<< 43681 1727204692.91949: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204692.91975: stdout chunk (state=3): >>>import 'ansible' # <<< 43681 1727204692.91998: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204692.92012: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204692.92024: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204692.92037: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 43681 1727204692.92053: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204692.93610: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204692.94910: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<< 43681 1727204692.94917: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f195336fd10> <<< 43681 1727204692.94940: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py <<< 43681 1727204692.94946: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 43681 1727204692.94978: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 43681 1727204692.94996: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 43681 1727204692.95013: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 43681 1727204692.95043: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 43681 1727204692.95049: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19533c99a0> <<< 43681 1727204692.95093: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19533c9730> <<< 43681 1727204692.95121: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19533c9040> <<< 43681 1727204692.95144: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 43681 
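Note: every `# zipimport: zlib available` record here marks an import satisfied from inside the AnsiballZ payload rather than from disk; the earlier "# zipimport: found 103 names in '...setup_payload.zip'" record shows the interpreter indexing that embedded archive. The sketch below illustrates only the underlying zipimport mechanism (a package imported straight out of a zip placed on sys.path); it is not Ansible's wrapper, and demo_pkg/greet are made-up names.

    # Mechanism sketch: import a package directly from a zip archive, the way
    # the setup payload's ansible.* modules are imported in this trace.
    import os
    import sys
    import tempfile
    import zipfile

    workdir = tempfile.mkdtemp(prefix="payload_demo_")
    archive = os.path.join(workdir, "payload.zip")

    with zipfile.ZipFile(archive, "w") as zf:
        zf.writestr("demo_pkg/__init__.py", "")
        zf.writestr("demo_pkg/hello.py", "def greet():\n    return 'hello from the zip'\n")

    sys.path.insert(0, archive)      # the zipimport hook handles .zip entries on sys.path
    from demo_pkg import hello       # resolved from inside the archive

    print(hello.greet())
    print(hello.__file__)            # ends with payload.zip/demo_pkg/hello.py
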
1727204692.95154: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 43681 1727204692.95203: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19533c9790> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1953396a80> <<< 43681 1727204692.95213: stdout chunk (state=3): >>>import 'atexit' # <<< 43681 1727204692.95238: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' <<< 43681 1727204692.95245: stdout chunk (state=3): >>># extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19533ca6c0> <<< 43681 1727204692.95268: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 43681 1727204692.95274: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19533ca900> <<< 43681 1727204692.95295: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 43681 1727204692.95341: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 43681 1727204692.95356: stdout chunk (state=3): >>>import '_locale' # <<< 43681 1727204692.95405: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19533cae40> <<< 43681 1727204692.95414: stdout chunk (state=3): >>>import 'pwd' # <<< 43681 1727204692.95436: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 43681 1727204692.95462: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 43681 1727204692.95505: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f195322cbc0> <<< 43681 1727204692.95533: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 43681 1727204692.95540: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f195322e7e0> <<< 43681 1727204692.95563: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 43681 1727204692.95578: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 43681 1727204692.95614: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f195322f1a0> <<< 43681 1727204692.95638: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 43681 1727204692.95666: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 43681 1727204692.95687: stdout chunk (state=3): >>>import 'shlex' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f1953230380> <<< 43681 1727204692.95704: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 43681 1727204692.95742: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 43681 1727204692.95769: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py <<< 43681 1727204692.95773: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 43681 1727204692.95827: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1953232e10> <<< 43681 1727204692.95867: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1953232f30> <<< 43681 1727204692.95899: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1953231100> <<< 43681 1727204692.95911: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 43681 1727204692.95939: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 43681 1727204692.95965: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py <<< 43681 1727204692.95977: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 43681 1727204692.95987: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 43681 1727204692.96019: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 43681 1727204692.96040: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py <<< 43681 1727204692.96049: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 43681 1727204692.96061: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1953236e10> <<< 43681 1727204692.96073: stdout chunk (state=3): >>>import '_tokenize' # <<< 43681 1727204692.96141: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19532358e0> <<< 43681 1727204692.96151: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1953235640> <<< 43681 1727204692.96171: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 43681 1727204692.96178: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 43681 1727204692.96254: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1953237ad0> <<< 43681 1727204692.96289: stdout chunk (state=3): 
>>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19532315e0> <<< 43681 1727204692.96314: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 43681 1727204692.96324: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f195327afc0> <<< 43681 1727204692.96350: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc'<<< 43681 1727204692.96356: stdout chunk (state=3): >>> import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f195327b110> <<< 43681 1727204692.96372: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 43681 1727204692.96395: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 43681 1727204692.96420: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py <<< 43681 1727204692.96424: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 43681 1727204692.96457: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 43681 1727204692.96463: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1953280ce0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1953280aa0> <<< 43681 1727204692.96482: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 43681 1727204692.96601: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 43681 1727204692.96652: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 43681 1727204692.96658: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1953283260> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19532813d0> <<< 43681 1727204692.96688: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 43681 1727204692.96731: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 43681 1727204692.96758: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 43681 1727204692.96772: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 43681 1727204692.96781: stdout chunk (state=3): >>>import '_string' # <<< 43681 1727204692.96831: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f195328a9f0> <<< 43681 1727204692.96986: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19532833b0> <<< 43681 1727204692.97061: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 43681 1727204692.97067: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f195328b7d0> <<< 43681 1727204692.97102: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f195328b9e0> <<< 43681 1727204692.97152: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 43681 1727204692.97163: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f195328b3b0> <<< 43681 1727204692.97172: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f195327b410> <<< 43681 1727204692.97199: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 43681 1727204692.97227: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 43681 1727204692.97250: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 43681 1727204692.97285: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 43681 1727204692.97313: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 43681 1727204692.97321: stdout chunk (state=3): >>>import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f195328f440> <<< 43681 1727204692.97514: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 43681 1727204692.97528: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19532902f0> <<< 43681 1727204692.97537: stdout chunk (state=3): >>>import 'socket' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f195328dbb0> <<< 43681 1727204692.97567: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 43681 1727204692.97574: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f195328ef30> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f195328d790> <<< 43681 1727204692.97603: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204692.97614: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204692.97629: stdout chunk (state=3): >>>import 'ansible.module_utils.compat' # <<< 43681 1727204692.97639: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204692.97742: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204692.97851: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204692.97864: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 43681 1727204692.97888: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204692.97909: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 43681 1727204692.97925: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204692.98070: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204692.98212: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204692.98902: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204692.99583: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 43681 1727204692.99609: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # <<< 43681 1727204692.99620: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 43681 1727204692.99638: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 43681 1727204692.99666: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 43681 1727204692.99726: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' <<< 43681 1727204692.99731: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1953118590> <<< 43681 1727204692.99841: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 43681 1727204692.99845: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 43681 1727204692.99867: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1953119a30> <<< 43681 1727204692.99886: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1953293770> <<< 43681 
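Note: the syslog and systemd.journal imports just above are pulled in for module logging; AnsibleModule reports its invocation to the systemd journal when the python-systemd bindings are present and appears to fall back to plain syslog otherwise. The snippet below is only the shape of that pattern, not the code in ansible.module_utils.basic.

    # Journal-or-syslog fallback pattern (sketch; the real logic lives in
    # ansible.module_utils.basic).
    import syslog

    try:
        from systemd import journal
        HAS_JOURNAL = True
    except ImportError:
        HAS_JOURNAL = False

    def log(msg, ident="demo-module"):
        if HAS_JOURNAL:
            journal.send(msg, SYSLOG_IDENTIFIER=ident)   # structured journald record
        else:
            syslog.openlog(ident, 0, syslog.LOG_USER)
            syslog.syslog(syslog.LOG_INFO, msg)

    log("Invoked with gather_subset=['all']")
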
1727204692.99936: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 43681 1727204692.99951: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204692.99972: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.00001: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 43681 1727204693.00009: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.00192: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.00380: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 43681 1727204693.00390: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 43681 1727204693.00400: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1953119ac0> <<< 43681 1727204693.00416: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.00993: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.01555: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.01639: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.01737: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 43681 1727204693.01743: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.01788: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.01832: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 43681 1727204693.01843: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.01927: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.02044: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 43681 1727204693.02063: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.02075: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 43681 1727204693.02101: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.02143: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.02192: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 43681 1727204693.02199: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.02483: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.02767: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 43681 1727204693.02839: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 43681 1727204693.02858: stdout chunk (state=3): >>>import '_ast' # <<< 43681 1727204693.02948: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f195311be30> <<< 43681 1727204693.02962: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.03044: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.03136: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 43681 1727204693.03144: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 43681 1727204693.03161: stdout chunk (state=3): >>>import 
'ansible.module_utils.common.arg_spec' # <<< 43681 1727204693.03188: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 43681 1727204693.03196: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 43681 1727204693.03276: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 43681 1727204693.03407: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1953122030> <<< 43681 1727204693.03464: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 43681 1727204693.03467: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1953122900> <<< 43681 1727204693.03481: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1953290170> <<< 43681 1727204693.03499: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.03541: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.03590: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # # zipimport: zlib available <<< 43681 1727204693.03641: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.03690: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.03752: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.03827: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 43681 1727204693.03875: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 43681 1727204693.03973: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so'<<< 43681 1727204693.03981: stdout chunk (state=3): >>> import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1953121790> <<< 43681 1727204693.04019: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1953122a20> <<< 43681 1727204693.04050: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 43681 1727204693.04068: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.04139: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.04209: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.04238: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.04288: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/__init__.py <<< 43681 1727204693.04297: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 43681 1727204693.04313: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 43681 1727204693.04337: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 43681 1727204693.04358: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 43681 1727204693.04426: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 43681 1727204693.04439: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 43681 1727204693.04463: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 43681 1727204693.04525: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19531bac90> <<< 43681 1727204693.04574: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f195312ca10> <<< 43681 1727204693.04657: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f195312aa50> <<< 43681 1727204693.04668: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f195312a8a0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 43681 1727204693.04676: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.04712: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.04740: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # <<< 43681 1727204693.04746: stdout chunk (state=3): >>>import 'ansible.module_utils.common.sys_info' # <<< 43681 1727204693.04808: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 43681 1727204693.04828: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.04835: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # <<< 43681 1727204693.04854: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.04924: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.04988: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.05010: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.05032: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.05085: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.05127: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.05169: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.05214: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 43681 1727204693.05224: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.05308: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.05392: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.05418: stdout chunk (state=3): >>># zipimport: zlib available 
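Note: with ansible.module_utils.basic imported (alongside distro and the selinux bindings above), the payload now has the generic AnsibleModule machinery; the ansible.modules and ansible.module_utils.facts.* imports around here belong to the setup module itself. Below is a minimal module skeleton using that same entry point; it is the generic pattern only, not the setup module, and its two parameters simply mirror setup's gather_subset/gather_timeout options.

    #!/usr/bin/python
    # Minimal module skeleton on top of ansible.module_utils.basic (sketch).
    from ansible.module_utils.basic import AnsibleModule

    def main():
        module = AnsibleModule(
            argument_spec=dict(
                gather_subset=dict(type="list", elements="str", default=["all"]),
                gather_timeout=dict(type="int", default=10),
            ),
            supports_check_mode=True,
        )
        # A real facts module would run its collectors here; this one only echoes.
        module.exit_json(changed=False, ansible_facts={}, requested=module.params)

    if __name__ == "__main__":
        main()
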
<<< 43681 1727204693.05455: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 43681 1727204693.05465: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.05667: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.05862: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.05910: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.05969: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py <<< 43681 1727204693.05975: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 43681 1727204693.05995: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 43681 1727204693.06015: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 43681 1727204693.06033: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 43681 1727204693.06063: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 43681 1727204693.06085: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19531bd640> <<< 43681 1727204693.06115: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 43681 1727204693.06123: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 43681 1727204693.06147: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 43681 1727204693.06193: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 43681 1727204693.06221: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 43681 1727204693.06228: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 43681 1727204693.06246: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f195270c2f0> <<< 43681 1727204693.06281: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 43681 1727204693.06296: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f195270c8c0> <<< 43681 1727204693.06351: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f195319d3a0> <<< 43681 1727204693.06370: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f195319c950> <<< 43681 1727204693.06409: stdout chunk (state=3): >>>import 'multiprocessing.context' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f19531bfc80> <<< 43681 1727204693.06416: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19531bf5c0> <<< 43681 1727204693.06439: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 43681 1727204693.06510: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 43681 1727204693.06535: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 43681 1727204693.06544: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 43681 1727204693.06571: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py <<< 43681 1727204693.06579: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 43681 1727204693.06613: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f195270f650> <<< 43681 1727204693.06621: stdout chunk (state=3): >>>import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f195270ef00> <<< 43681 1727204693.06646: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' <<< 43681 1727204693.06651: stdout chunk (state=3): >>># extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f195270f0e0> <<< 43681 1727204693.06667: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f195270e360> <<< 43681 1727204693.06690: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 43681 1727204693.06809: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 43681 1727204693.06823: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f195270f830> <<< 43681 1727204693.06840: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 43681 1727204693.06877: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 43681 1727204693.06907: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' <<< 43681 1727204693.06914: stdout chunk (state=3): >>># extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1952776330> <<< 43681 
1727204693.06941: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1952774350> <<< 43681 1727204693.06971: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19531bf650> import 'ansible.module_utils.facts.timeout' # <<< 43681 1727204693.06999: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # <<< 43681 1727204693.07019: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.07035: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 43681 1727204693.07050: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.07113: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.07173: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 43681 1727204693.07193: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.07252: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.07304: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 43681 1727204693.07322: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.07335: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 43681 1727204693.07352: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.07391: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.07421: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # <<< 43681 1727204693.07428: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.07492: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.07541: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 43681 1727204693.07549: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.07596: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.07643: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 43681 1727204693.07652: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.07719: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.07783: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.07846: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.07911: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # <<< 43681 1727204693.07923: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.cmdline' # <<< 43681 1727204693.07929: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.08472: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.08961: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 43681 1727204693.08978: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.09034: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.09099: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.09130: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.09167: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 
43681 1727204693.09193: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.09220: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.09253: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 43681 1727204693.09262: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.09327: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.09384: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 43681 1727204693.09405: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.09433: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.09473: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 43681 1727204693.09479: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.09514: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.09541: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 43681 1727204693.09559: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.09641: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.09742: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 43681 1727204693.09749: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 43681 1727204693.09776: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1952777bf0> <<< 43681 1727204693.09799: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 43681 1727204693.09832: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 43681 1727204693.09966: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1952776ff0> <<< 43681 1727204693.09970: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.local' # <<< 43681 1727204693.09984: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.10049: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.10123: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 43681 1727204693.10139: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.10233: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.10342: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 43681 1727204693.10349: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.10422: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.10503: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 43681 1727204693.10520: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.10559: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.10614: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 43681 1727204693.10662: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 43681 1727204693.10742: stdout chunk (state=3): >>># 
extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 43681 1727204693.10807: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19527a65a0> <<< 43681 1727204693.11023: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1952793440> import 'ansible.module_utils.facts.system.python' # <<< 43681 1727204693.11041: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.11101: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.11162: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 43681 1727204693.11173: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.11271: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.11360: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.11488: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.11648: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # <<< 43681 1727204693.11660: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.service_mgr' # <<< 43681 1727204693.11667: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.11715: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.11748: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 43681 1727204693.11766: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.11808: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.11862: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 43681 1727204693.11905: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 43681 1727204693.11926: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1952541df0> <<< 43681 1727204693.11942: stdout chunk (state=3): >>>import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1952541be0> import 'ansible.module_utils.facts.system.user' # <<< 43681 1727204693.11961: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.11967: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware' # <<< 43681 1727204693.11993: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.12032: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.12079: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 43681 1727204693.12088: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.12267: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.12441: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 43681 1727204693.12450: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 
1727204693.12560: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.12675: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.12716: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.12764: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 43681 1727204693.12780: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # <<< 43681 1727204693.12796: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.12809: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.12836: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.12995: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.13160: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 43681 1727204693.13171: stdout chunk (state=3): >>> <<< 43681 1727204693.13177: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.13313: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.13445: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 43681 1727204693.13466: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.13499: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.13541: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.14174: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.14761: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 43681 1727204693.14778: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.14895: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.15014: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 43681 1727204693.15023: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.15136: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.15249: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 43681 1727204693.15259: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.15428: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.15610: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 43681 1727204693.15630: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.15636: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 43681 1727204693.15659: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.15703: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.15753: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 43681 1727204693.15760: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.15871: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.15978: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.16215: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.16444: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # <<< 43681 1727204693.16456: 
stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.aix' # <<< 43681 1727204693.16465: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.16507: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.16538: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 43681 1727204693.16558: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.16581: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.16612: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 43681 1727204693.16619: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.16700: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.16779: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 43681 1727204693.16786: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.16811: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.16841: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # <<< 43681 1727204693.16847: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.16917: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.16975: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 43681 1727204693.16996: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.17051: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.17124: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 43681 1727204693.17130: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.17437: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.17731: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 43681 1727204693.17735: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.17804: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.17872: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 43681 1727204693.17876: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.17919: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.17962: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 43681 1727204693.17970: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.18004: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.18039: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 43681 1727204693.18051: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.18086: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.18127: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 43681 1727204693.18137: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.18224: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.18316: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 43681 1727204693.18335: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.18342: stdout chunk (state=3): >>># zipimport: zlib available import 
'ansible.module_utils.facts.virtual' # <<< 43681 1727204693.18367: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.18410: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.18463: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 43681 1727204693.18469: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.18499: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.18517: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.18572: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.18624: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.18706: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.18786: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # <<< 43681 1727204693.18791: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.freebsd' # <<< 43681 1727204693.18808: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 43681 1727204693.18814: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.18863: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.18922: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 43681 1727204693.18933: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.19155: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.19372: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 43681 1727204693.19385: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.19435: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.19492: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 43681 1727204693.19499: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.19551: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.19606: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 43681 1727204693.19610: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.19703: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.19792: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 43681 1727204693.19818: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.19913: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.20012: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 43681 1727204693.20107: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.21055: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py <<< 43681 1727204693.21064: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 43681 1727204693.21088: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 43681 1727204693.21106: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 43681 1727204693.21141: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' <<< 43681 1727204693.21154: stdout chunk (state=3): >>># extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f195256afc0> <<< 43681 1727204693.21164: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f195256a120> <<< 43681 1727204693.21222: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19525687d0> <<< 43681 1727204693.33810: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py <<< 43681 1727204693.33817: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' <<< 43681 1727204693.33836: stdout chunk (state=3): >>>import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19525b02f0> <<< 43681 1727204693.33860: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py <<< 43681 1727204693.33881: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' <<< 43681 1727204693.33909: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19525b1790> <<< 43681 1727204693.33962: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py <<< 43681 1727204693.33974: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' <<< 43681 1727204693.34001: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py <<< 43681 1727204693.34010: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' <<< 43681 1727204693.34038: stdout chunk (state=3): >>>import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19525b3b00> <<< 43681 1727204693.34059: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19525b2ba0> <<< 43681 1727204693.34302: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 43681 1727204693.58351: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_local": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], 
"executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAI5YZQ7OH6eqgmanrwxkUl16pMvE2q26X32NofYRKBzF04m84VIsiCBP80rN+sGEKnRhTwlxJwcSfAyscmxkynk8ozeR0SaMEECkbOjee1DqGR1yz8VSKEIk2gZ+ImYscF6c32jGvz1w/gz9baswEs+v92Ljqv3+V3s8foVkwWM1AAAAFQDApo03iAyJzp9y7AillVl9LpN8rwAAAIBNHNvfLLH/rvWMdavYWGiljarx5Z8cDKFv4QiliuY2AenrQ5mjBN3ZJZuDpmwC9vuoPM+TWxp9pbrnVJy4VM6iS8c/Lr9I982fUD4neMvJEywdnYtsRhezGMCk57/Npw91h6EKhcAYiaFF53jl540WIjTvu2bEA8Hgb11YGH+isAAAAIAkremps+61DEFeDWQjRHbf8fZzhmpUYduU+sHRW5usa/1cOOeeN/8XBHfMST6TPedAY/6t7Oxda9D2mq6mo2Rl9arSQWcBypqwvzRiz0LGnRnElGtXKJALy6vYKG7xi+29ZmqlBvD14cB7/wSZqZP9MkRj3+QzQJLvNnuGRyLguA==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDBj8PEtqglWtlJ3r3hgP2TELjSd8JOOpjIitLlWjKdUao5ePB6PWTf9MZV0rLZr0re7hAS1EWeexARYQakyETmyOoPmRCaD5vvrfN3AJJ6I+O2EhApLpYrEORJbTfrme6AoCGmxQG8tR7j3YpVOvePZ65ka7FDUWnNLI0DWpyDURAKmvOxtiOcYazpmB7GJ/5ycpEAV7KGp7tEQ9MNIAbSaYTBXVBNa5V2HyEmcabs+/Qy/jp8OWy+Tl3uCUV0SmFplVCKib9Kp3eEMZd5udXsYnmUYtLNMJQkQOzTdol5AozustkdBnasVn/RSnQpWQMBrrUQMxchNOb8FDAuH6AONEVJl9mHY6mk3zfkkyPZE6sIrMIj0B48xTWzMIjC+N9SN7DRRUWzjYIqkL5fsYu0fkkGuZeNvyJRlv8h7oFWA7YtvNHdNYf41mkXryERg8V3zI0aZcmQul6XTOxywwd4b5sudMIng09hfyPOKtnYi6D<<< 43681 1727204693.58389: stdout chunk (state=3): >>>IN2h5FxOWlvBEbLlcd2U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPUvqdp1GSRMDwSqfOZO1hLGpDfzy51B9cIhTK2AWy7qlUXPaSlJ0jc31uj+CW3SnUW36VSKRHdj9R9hJev9Zic=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFL7RdA+aCgUcBhcJBLwti3mnwduhYXxSw8RlI3Cvebm", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec272ed147e29e35f2e68cd6465c5ec1", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_user_id": "root", "ansible_user_uid": 0, 
"ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_iscsi_iqn": "", "ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_is_chroot": false, "ansible_loadavg": {"1m": 1.0390625, "5m": 0.8974609375, "15m": 0.5546875}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "04", "second": "53", "epoch": "1727204693", "epoch_int": "1727204693", "date": "2024-09-24", "time": "15:04:53", "iso8601_micro": "2024-09-24T19:04:53.215937Z", "iso8601": "2024-09-24T19:04:53Z", "iso8601_basic": "20240924T150453215937", "iso8601_basic_short": "20240924T150453", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_hostnqn": "", "ansible_fibre_channel_wwn": [], "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 50414 10.31.10.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 50414 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fips": false, "ansible_lsb": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2829, "ansib<<< 43681 1727204693.58431: stdout chunk (state=3): >>>le_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 888, "free": 2829}, "nocache": {"free": 3460, "used": 257}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_uuid": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", 
"partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 1197, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251148439552, "block_size": 4096, "block_total": 64479564, "block_available": 61315537, "block_used": 3164027, "inode_total": 16384000, "inode_available": 16302070, "inode_used": 81930, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::37d3:4e93:30d:de94", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off 
[fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "<<< 43681 1727204693.58439: stdout chunk 
(state=3): >>>10.31.8.0", "prefix": "22", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.90"], "ansible_all_ipv6_addresses": ["fe80::37d3:4e93:30d:de94"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::37d3:4e93:30d:de94"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 43681 1727204693.59036: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 43681 1727204693.59050: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys<<< 43681 1727204693.59074: stdout chunk (state=3): >>> # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io<<< 43681 1727204693.59094: stdout chunk (state=3): >>> # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os <<< 43681 1727204693.59098: stdout chunk (state=3): >>># cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections <<< 43681 1727204693.59126: stdout chunk (state=3): >>># cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery <<< 43681 1727204693.59140: stdout chunk (state=3): >>># cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # 
cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref <<< 43681 1727204693.59149: stdout chunk (state=3): >>># cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ <<< 43681 1727204693.59177: stdout chunk (state=3): >>># cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex <<< 43681 1727204693.59201: stdout chunk (state=3): >>># cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing 
ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing <<< 43681 1727204693.59232: stdout chunk (state=3): >>># destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro <<< 43681 1727204693.59270: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing 
ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps <<< 43681 1727204693.59279: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux <<< 43681 1727204693.59323: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing 
ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector <<< 43681 1727204693.59327: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd <<< 43681 1727204693.59347: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # 
destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 43681 1727204693.59689: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 43681 1727204693.59696: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 43681 1727204693.59724: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma <<< 43681 1727204693.59745: stdout chunk (state=3): >>># destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 43681 1727204693.59767: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 43681 1727204693.59816: stdout chunk (state=3): >>># destroy ntpath # destroy importlib <<< 43681 1727204693.59838: stdout chunk (state=3): >>># destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib <<< 43681 1727204693.59858: stdout chunk (state=3): >>># destroy json.decoder # destroy json.encoder # destroy json.scanner <<< 43681 1727204693.59875: stdout chunk (state=3): >>># destroy _json # destroy grp # destroy encodings # destroy _locale <<< 43681 1727204693.59894: stdout chunk (state=3): >>># destroy locale # destroy select # destroy _signal <<< 43681 1727204693.59907: stdout chunk (state=3): >>># destroy _posixsubprocess # destroy syslog <<< 43681 1727204693.59919: stdout chunk (state=3): >>># destroy uuid <<< 43681 1727204693.59951: stdout chunk (state=3): >>># destroy _hashlib <<< 43681 1727204693.59971: stdout chunk (state=3): >>># destroy _blake2 # destroy selinux # destroy shutil <<< 43681 1727204693.59975: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 43681 1727204693.60028: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector<<< 43681 1727204693.60036: stdout chunk (state=3): >>> # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle <<< 43681 1727204693.60061: stdout chunk (state=3): >>># destroy _pickle # destroy queue <<< 43681 1727204693.60070: stdout chunk (state=3): >>># destroy _heapq # destroy _queue <<< 43681 1727204693.60094: stdout chunk (state=3): >>># destroy multiprocessing.reduction # destroy selectors <<< 43681 
1727204693.60101: stdout chunk (state=3): >>># destroy shlex # destroy fcntl <<< 43681 1727204693.60121: stdout chunk (state=3): >>># destroy datetime # destroy subprocess # destroy base64 <<< 43681 1727204693.60147: stdout chunk (state=3): >>># destroy _ssl <<< 43681 1727204693.60163: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux <<< 43681 1727204693.60175: stdout chunk (state=3): >>># destroy getpass # destroy pwd # destroy termios # destroy json <<< 43681 1727204693.60203: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob <<< 43681 1727204693.60232: stdout chunk (state=3): >>># destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno <<< 43681 1727204693.60246: stdout chunk (state=3): >>># destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection <<< 43681 1727204693.60313: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian <<< 43681 1727204693.60332: stdout chunk (state=3): >>># cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback <<< 43681 1727204693.60340: stdout chunk (state=3): >>># destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 43681 1727204693.60366: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading <<< 43681 1727204693.60377: stdout chunk (state=3): >>># cleanup[3] wiping weakref <<< 43681 1727204693.60411: stdout chunk (state=3): >>># cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants <<< 43681 1727204693.60427: stdout chunk (state=3): >>># destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc <<< 43681 1727204693.60449: stdout chunk (state=3): >>># destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types <<< 43681 1727204693.60463: stdout chunk (state=3): >>># cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io <<< 43681 1727204693.60492: 
stdout chunk (state=3): >>># destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io <<< 43681 1727204693.60505: stdout chunk (state=3): >>># cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon <<< 43681 1727204693.60513: stdout chunk (state=3): >>># destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 43681 1727204693.60649: stdout chunk (state=3): >>># destroy sys.monitoring <<< 43681 1727204693.60657: stdout chunk (state=3): >>># destroy _socket <<< 43681 1727204693.60671: stdout chunk (state=3): >>># destroy _collections <<< 43681 1727204693.60706: stdout chunk (state=3): >>># destroy platform # destroy _uuid <<< 43681 1727204693.60713: stdout chunk (state=3): >>># destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 43681 1727204693.60740: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 43681 1727204693.60781: stdout chunk (state=3): >>># destroy _typing <<< 43681 1727204693.60786: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request <<< 43681 1727204693.60797: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 43681 1727204693.60805: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 43681 1727204693.60824: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 43681 1727204693.60918: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig<<< 43681 1727204693.60934: stdout chunk (state=3): >>> # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 43681 1727204693.60962: stdout chunk (state=3): >>># destroy _random <<< 43681 1727204693.60969: stdout chunk (state=3): >>># destroy _weakref <<< 43681 1727204693.60997: stdout chunk (state=3): >>># destroy _operator # destroy _sha2 # destroy _sre <<< 43681 1727204693.61020: stdout chunk (state=3): >>># destroy _string # destroy re # destroy itertools <<< 43681 1727204693.61038: stdout chunk (state=3): >>># destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread <<< 43681 1727204693.61045: stdout chunk (state=3): >>># clear sys.audit hooks <<< 43681 1727204693.61496: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
<<< 43681 1727204693.61562: stderr chunk (state=3): >>><<< 43681 1727204693.61566: stdout chunk (state=3): >>><<< 43681 1727204693.61686: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19538b44d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1953883ad0> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19538b6a20> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19536690a0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1953669fd0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19536a7da0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19536a7fe0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19536df7a0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19536dfe30> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19536bfa40> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19536bd190> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19536a4f50> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1953703680> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19537022a0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19536be180> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1953700b30> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1953734680> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19536a41d0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1953734b30> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19537349e0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1953734da0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19536a2d20> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1953735460> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1953735130> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1953736360> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1953750590> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1953751cd0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f1953752bd0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1953753230> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1953752120> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1953753c80> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19537533e0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19537363c0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1953487bc0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19534b0680> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19534b03e0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19534b06b0> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19534b0890> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1953485d60> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19534b1f70> 
import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19534b0bf0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1953736ab0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19534de330> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19534f6480> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1953533230> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19535559d0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1953533350> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19534f7110> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1953330380> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19534f54c0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19534b2ed0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f19534f55e0> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_f1hbn4wr/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # 
/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1953396000> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f195336cef0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1953333f80> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f195336fd10> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19533c99a0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19533c9730> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19533c9040> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19533c9790> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1953396a80> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19533ca6c0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19533ca900> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19533cae40> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from 
'/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f195322cbc0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f195322e7e0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f195322f1a0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1953230380> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1953232e10> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1953232f30> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1953231100> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1953236e10> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19532358e0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1953235640> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1953237ad0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19532315e0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f195327afc0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f195327b110> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1953280ce0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1953280aa0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1953283260> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19532813d0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f195328a9f0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19532833b0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f195328b7d0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f195328b9e0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 
'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f195328b3b0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f195327b410> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f195328f440> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19532902f0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f195328dbb0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f195328ef30> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f195328d790> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1953118590> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1953119a30> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1953293770> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available 
# zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1953119ac0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f195311be30> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1953122030> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1953122900> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1953290170> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1953121790> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1953122a20> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19531bac90> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f195312ca10> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f195312aa50> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f195312a8a0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19531bd640> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f195270c2f0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f195270c8c0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f195319d3a0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f195319c950> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19531bfc80> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19531bf5c0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f195270f650> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f195270ef00> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f195270f0e0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f195270e360> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f195270f830> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1952776330> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1952774350> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19531bf650> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib 
available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1952777bf0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1952776ff0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19527a65a0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1952793440> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 
'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1952541df0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1952541be0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f195256afc0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f195256a120> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19525687d0> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19525b02f0> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19525b1790> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19525b3b00> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19525b2ba0> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_local": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAI5YZQ7OH6eqgmanrwxkUl16pMvE2q26X32NofYRKBzF04m84VIsiCBP80rN+sGEKnRhTwlxJwcSfAyscmxkynk8ozeR0SaMEECkbOjee1DqGR1yz8VSKEIk2gZ+ImYscF6c32jGvz1w/gz9baswEs+v92Ljqv3+V3s8foVkwWM1AAAAFQDApo03iAyJzp9y7AillVl9LpN8rwAAAIBNHNvfLLH/rvWMdavYWGiljarx5Z8cDKFv4QiliuY2AenrQ5mjBN3ZJZuDpmwC9vuoPM+TWxp9pbrnVJy4VM6iS8c/Lr9I982fUD4neMvJEywdnYtsRhezGMCk57/Npw91h6EKhcAYiaFF53jl540WIjTvu2bEA8Hgb11YGH+isAAAAIAkremps+61DEFeDWQjRHbf8fZzhmpUYduU+sHRW5usa/1cOOeeN/8XBHfMST6TPedAY/6t7Oxda9D2mq6mo2Rl9arSQWcBypqwvzRiz0LGnRnElGtXKJALy6vYKG7xi+29ZmqlBvD14cB7/wSZqZP9MkRj3+QzQJLvNnuGRyLguA==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDBj8PEtqglWtlJ3r3hgP2TELjSd8JOOpjIitLlWjKdUao5ePB6PWTf9MZV0rLZr0re7hAS1EWeexARYQakyETmyOoPmRCaD5vvrfN3AJJ6I+O2EhApLpYrEORJbTfrme6AoCGmxQG8tR7j3YpVOvePZ65ka7FDUWnNLI0DWpyDURAKmvOxtiOcYazpmB7GJ/5ycpEAV7KGp7tEQ9MNIAbSaYTBXVBNa5V2HyEmcabs+/Qy/jp8OWy+Tl3uCUV0SmFplVCKib9Kp3eEMZd5udXsYnmUYtLNMJQkQOzTdol5AozustkdBnasVn/RSnQpWQMBrrUQMxchNOb8FDAuH6AONEVJl9mHY6mk3zfkkyPZE6sIrMIj0B48xTWzMIjC+N9SN7DRRUWzjYIqkL5fsYu0fkkGuZeNvyJRlv8h7oFWA7YtvNHdNYf41mkXryERg8V3zI0aZcmQul6XTOxywwd4b5sudMIng09hfyPOKtnYi6DIN2h5FxOWlvBEbLlcd2U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPUvqdp1GSRMDwSqfOZO1hLGpDfzy51B9cIhTK2AWy7qlUXPaSlJ0jc31uj+CW3SnUW36VSKRHdj9R9hJev9Zic=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFL7RdA+aCgUcBhcJBLwti3mnwduhYXxSw8RlI3Cvebm", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": 
"#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec272ed147e29e35f2e68cd6465c5ec1", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_iscsi_iqn": "", "ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_is_chroot": false, "ansible_loadavg": {"1m": 1.0390625, "5m": 0.8974609375, "15m": 0.5546875}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "04", "second": "53", "epoch": "1727204693", "epoch_int": "1727204693", "date": "2024-09-24", "time": "15:04:53", "iso8601_micro": "2024-09-24T19:04:53.215937Z", "iso8601": "2024-09-24T19:04:53Z", "iso8601_basic": "20240924T150453215937", "iso8601_basic_short": "20240924T150453", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_hostnqn": "", "ansible_fibre_channel_wwn": [], "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 50414 10.31.10.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 50414 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fips": false, "ansible_lsb": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2829, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, 
"ansible_memory_mb": {"real": {"total": 3717, "used": 888, "free": 2829}, "nocache": {"free": 3460, "used": 257}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_uuid": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 1197, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251148439552, "block_size": 4096, "block_total": 64479564, "block_available": 61315537, "block_used": 3164027, "inode_total": 16384000, "inode_available": 16302070, "inode_used": 81930, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::37d3:4e93:30d:de94", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", 
"tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": 
"on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.90"], "ansible_all_ipv6_addresses": ["fe80::37d3:4e93:30d:de94"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::37d3:4e93:30d:de94"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] 
removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy 
ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing 
ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing 
ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy 
ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping 
ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy 
_sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. [WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # 
cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # 
cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing 
ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing 
ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy 
ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string 
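The long run of "import ...", "# cleanup[...] removing/wiping ...", and "# destroy ..." lines above and below is not Ansible's own debug output: it is CPython's verbose import and shutdown trace from the remote interpreter, which is launched with PYTHONVERBOSE=1 (see the /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 ... AnsiballZ_setup.py && sleep 0' command further down in this log). A minimal local sketch, not part of the test run itself, that reproduces the same kind of trace; the imported module ('json') is only an illustration:

import os
import subprocess
import sys

# Run a trivial Python program with PYTHONVERBOSE=1 set, the same way the
# AnsiballZ_setup.py payload is invoked later in this log.
env = dict(os.environ, PYTHONVERBOSE="1")
result = subprocess.run(
    [sys.executable, "-c", "import json"],
    env=env,
    capture_output=True,
    text=True,
)

# Python's verbose "import ..." lines and the "# cleanup[...]"/"# destroy ..."
# shutdown trace are written to stderr; show the tail of it.
print("\n".join(result.stderr.splitlines()[-10:]))

Running the sketch prints the same style of per-module cleanup lines seen in this section, which is why the fact-gathering task produces so much output at this verbosity even though the task itself succeeds.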
# cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host managed-node3 is using the discovered 
Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible- core/2.17/reference_appendices/interpreter_discovery.html for more information. 43681 1727204693.62606: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204692.3892121-43795-149885710218505/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 43681 1727204693.62610: _low_level_execute_command(): starting 43681 1727204693.62620: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204692.3892121-43795-149885710218505/ > /dev/null 2>&1 && sleep 0' 43681 1727204693.62920: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204693.62923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 43681 1727204693.62926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204693.62928: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204693.62930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204693.62981: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204693.62984: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204693.63028: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204693.64949: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204693.65003: stderr chunk (state=3): >>><<< 43681 1727204693.65007: stdout chunk (state=3): >>><<< 43681 1727204693.65023: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204693.65036: handler run complete 43681 1727204693.65156: variable 'ansible_facts' from source: unknown 43681 1727204693.65242: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204693.65561: variable 'ansible_facts' from source: unknown 43681 1727204693.65657: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204693.66044: attempt loop complete, returning result 43681 1727204693.66048: _execute() done 43681 1727204693.66050: dumping result to json 43681 1727204693.66052: done dumping result, returning 43681 1727204693.66054: done running TaskExecutor() for managed-node3/TASK: Gathering Facts [12b410aa-8751-9e86-7728-0000000000af] 43681 1727204693.66057: sending task result for task 12b410aa-8751-9e86-7728-0000000000af ok: [managed-node3] 43681 1727204693.66767: done sending task result for task 12b410aa-8751-9e86-7728-0000000000af 43681 1727204693.66999: WORKER PROCESS EXITING 43681 1727204693.66992: no more pending results, returning what we have 43681 1727204693.67004: results queue empty 43681 1727204693.67005: checking for any_errors_fatal 43681 1727204693.67007: done checking for any_errors_fatal 43681 1727204693.67008: checking for max_fail_percentage 43681 1727204693.67010: done checking for max_fail_percentage 43681 1727204693.67011: checking to see if all hosts have failed and the running result is not ok 43681 1727204693.67012: done checking to see if all hosts have failed 43681 1727204693.67013: getting the remaining hosts for this loop 43681 1727204693.67015: done getting the remaining hosts for this loop 43681 1727204693.67022: getting the next task for host managed-node3 43681 1727204693.67030: done getting next task for host managed-node3 43681 1727204693.67033: ^ task is: TASK: meta (flush_handlers) 43681 1727204693.67035: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204693.67040: getting variables 43681 1727204693.67042: in VariableManager get_vars() 43681 1727204693.67077: Calling all_inventory to load vars for managed-node3 43681 1727204693.67081: Calling groups_inventory to load vars for managed-node3 43681 1727204693.67086: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204693.67100: Calling all_plugins_play to load vars for managed-node3 43681 1727204693.67104: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204693.67109: Calling groups_plugins_play to load vars for managed-node3 43681 1727204693.67419: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204693.67742: done with get_vars() 43681 1727204693.67756: done getting variables 43681 1727204693.67854: in VariableManager get_vars() 43681 1727204693.67867: Calling all_inventory to load vars for managed-node3 43681 1727204693.67870: Calling groups_inventory to load vars for managed-node3 43681 1727204693.67873: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204693.67879: Calling all_plugins_play to load vars for managed-node3 43681 1727204693.67883: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204693.67886: Calling groups_plugins_play to load vars for managed-node3 43681 1727204693.68145: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204693.68476: done with get_vars() 43681 1727204693.68504: done queuing things up, now waiting for results queue to drain 43681 1727204693.68508: results queue empty 43681 1727204693.68509: checking for any_errors_fatal 43681 1727204693.68512: done checking for any_errors_fatal 43681 1727204693.68513: checking for max_fail_percentage 43681 1727204693.68515: done checking for max_fail_percentage 43681 1727204693.68518: checking to see if all hosts have failed and the running result is not ok 43681 1727204693.68524: done checking to see if all hosts have failed 43681 1727204693.68525: getting the remaining hosts for this loop 43681 1727204693.68527: done getting the remaining hosts for this loop 43681 1727204693.68530: getting the next task for host managed-node3 43681 1727204693.68537: done getting next task for host managed-node3 43681 1727204693.68540: ^ task is: TASK: Include the task 'el_repo_setup.yml' 43681 1727204693.68542: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204693.68545: getting variables 43681 1727204693.68546: in VariableManager get_vars() 43681 1727204693.68557: Calling all_inventory to load vars for managed-node3 43681 1727204693.68560: Calling groups_inventory to load vars for managed-node3 43681 1727204693.68563: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204693.68569: Calling all_plugins_play to load vars for managed-node3 43681 1727204693.68572: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204693.68576: Calling groups_plugins_play to load vars for managed-node3 43681 1727204693.68807: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204693.69148: done with get_vars() 43681 1727204693.69159: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_routing_rules_nm.yml:11 Tuesday 24 September 2024 15:04:53 -0400 (0:00:01.344) 0:00:01.359 ***** 43681 1727204693.69266: entering _queue_task() for managed-node3/include_tasks 43681 1727204693.69269: Creating lock for include_tasks 43681 1727204693.69736: worker is 1 (out of 1 available) 43681 1727204693.69750: exiting _queue_task() for managed-node3/include_tasks 43681 1727204693.69763: done queuing things up, now waiting for results queue to drain 43681 1727204693.69765: waiting for pending results... 43681 1727204693.70243: running TaskExecutor() for managed-node3/TASK: Include the task 'el_repo_setup.yml' 43681 1727204693.70312: in run() - task 12b410aa-8751-9e86-7728-000000000006 43681 1727204693.70340: variable 'ansible_search_path' from source: unknown 43681 1727204693.70396: calling self._execute() 43681 1727204693.70498: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204693.70514: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204693.70535: variable 'omit' from source: magic vars 43681 1727204693.70646: _execute() done 43681 1727204693.70651: dumping result to json 43681 1727204693.70654: done dumping result, returning 43681 1727204693.70661: done running TaskExecutor() for managed-node3/TASK: Include the task 'el_repo_setup.yml' [12b410aa-8751-9e86-7728-000000000006] 43681 1727204693.70667: sending task result for task 12b410aa-8751-9e86-7728-000000000006 43681 1727204693.70769: done sending task result for task 12b410aa-8751-9e86-7728-000000000006 43681 1727204693.70772: WORKER PROCESS EXITING 43681 1727204693.70839: no more pending results, returning what we have 43681 1727204693.70844: in VariableManager get_vars() 43681 1727204693.70872: Calling all_inventory to load vars for managed-node3 43681 1727204693.70875: Calling groups_inventory to load vars for managed-node3 43681 1727204693.70878: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204693.70891: Calling all_plugins_play to load vars for managed-node3 43681 1727204693.70895: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204693.70899: Calling groups_plugins_play to load vars for managed-node3 43681 1727204693.71086: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204693.71258: done with get_vars() 43681 1727204693.71269: variable 'ansible_search_path' from source: unknown 43681 1727204693.71280: we have included files to process 43681 1727204693.71281: 
generating all_blocks data 43681 1727204693.71282: done generating all_blocks data 43681 1727204693.71282: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 43681 1727204693.71283: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 43681 1727204693.71285: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 43681 1727204693.71927: in VariableManager get_vars() 43681 1727204693.71946: done with get_vars() 43681 1727204693.71962: done processing included file 43681 1727204693.71964: iterating over new_blocks loaded from include file 43681 1727204693.71966: in VariableManager get_vars() 43681 1727204693.71977: done with get_vars() 43681 1727204693.71978: filtering new block on tags 43681 1727204693.71999: done filtering new block on tags 43681 1727204693.72010: in VariableManager get_vars() 43681 1727204693.72024: done with get_vars() 43681 1727204693.72026: filtering new block on tags 43681 1727204693.72045: done filtering new block on tags 43681 1727204693.72048: in VariableManager get_vars() 43681 1727204693.72083: done with get_vars() 43681 1727204693.72085: filtering new block on tags 43681 1727204693.72104: done filtering new block on tags 43681 1727204693.72107: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed-node3 43681 1727204693.72118: extending task lists for all hosts with included blocks 43681 1727204693.72180: done extending task lists 43681 1727204693.72181: done processing included files 43681 1727204693.72182: results queue empty 43681 1727204693.72183: checking for any_errors_fatal 43681 1727204693.72185: done checking for any_errors_fatal 43681 1727204693.72186: checking for max_fail_percentage 43681 1727204693.72187: done checking for max_fail_percentage 43681 1727204693.72188: checking to see if all hosts have failed and the running result is not ok 43681 1727204693.72191: done checking to see if all hosts have failed 43681 1727204693.72192: getting the remaining hosts for this loop 43681 1727204693.72193: done getting the remaining hosts for this loop 43681 1727204693.72196: getting the next task for host managed-node3 43681 1727204693.72201: done getting next task for host managed-node3 43681 1727204693.72203: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 43681 1727204693.72206: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204693.72209: getting variables 43681 1727204693.72210: in VariableManager get_vars() 43681 1727204693.72223: Calling all_inventory to load vars for managed-node3 43681 1727204693.72226: Calling groups_inventory to load vars for managed-node3 43681 1727204693.72229: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204693.72235: Calling all_plugins_play to load vars for managed-node3 43681 1727204693.72239: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204693.72243: Calling groups_plugins_play to load vars for managed-node3 43681 1727204693.72468: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204693.72787: done with get_vars() 43681 1727204693.72799: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Tuesday 24 September 2024 15:04:53 -0400 (0:00:00.036) 0:00:01.395 ***** 43681 1727204693.72891: entering _queue_task() for managed-node3/setup 43681 1727204693.73403: worker is 1 (out of 1 available) 43681 1727204693.73413: exiting _queue_task() for managed-node3/setup 43681 1727204693.73424: done queuing things up, now waiting for results queue to drain 43681 1727204693.73426: waiting for pending results... 43681 1727204693.73515: running TaskExecutor() for managed-node3/TASK: Gather the minimum subset of ansible_facts required by the network role test 43681 1727204693.73655: in run() - task 12b410aa-8751-9e86-7728-0000000000c0 43681 1727204693.73660: variable 'ansible_search_path' from source: unknown 43681 1727204693.73663: variable 'ansible_search_path' from source: unknown 43681 1727204693.73763: calling self._execute() 43681 1727204693.73774: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204693.73792: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204693.73810: variable 'omit' from source: magic vars 43681 1727204693.74496: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 43681 1727204693.76165: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 43681 1727204693.76223: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 43681 1727204693.76253: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 43681 1727204693.76294: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 43681 1727204693.76322: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 43681 1727204693.76389: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204693.76421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204693.76441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 43681 1727204693.76595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204693.76599: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204693.76718: variable 'ansible_facts' from source: unknown 43681 1727204693.76823: variable 'network_test_required_facts' from source: task vars 43681 1727204693.76869: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 43681 1727204693.76881: variable 'omit' from source: magic vars 43681 1727204693.76933: variable 'omit' from source: magic vars 43681 1727204693.76978: variable 'omit' from source: magic vars 43681 1727204693.77019: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204693.77056: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204693.77080: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204693.77109: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204693.77126: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204693.77164: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204693.77173: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204693.77181: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204693.77303: Set connection var ansible_shell_type to sh 43681 1727204693.77316: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204693.77328: Set connection var ansible_timeout to 10 43681 1727204693.77343: Set connection var ansible_pipelining to False 43681 1727204693.77354: Set connection var ansible_connection to ssh 43681 1727204693.77364: Set connection var ansible_shell_executable to /bin/sh 43681 1727204693.77394: variable 'ansible_shell_executable' from source: unknown 43681 1727204693.77595: variable 'ansible_connection' from source: unknown 43681 1727204693.77599: variable 'ansible_module_compression' from source: unknown 43681 1727204693.77602: variable 'ansible_shell_type' from source: unknown 43681 1727204693.77605: variable 'ansible_shell_executable' from source: unknown 43681 1727204693.77607: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204693.77609: variable 'ansible_pipelining' from source: unknown 43681 1727204693.77611: variable 'ansible_timeout' from source: unknown 43681 1727204693.77613: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204693.77616: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 43681 1727204693.77626: variable 'omit' from source: magic vars 43681 1727204693.77637: starting attempt loop 43681 
1727204693.77644: running the handler 43681 1727204693.77661: _low_level_execute_command(): starting 43681 1727204693.77674: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 43681 1727204693.78371: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204693.78386: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204693.78406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204693.78426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204693.78444: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 43681 1727204693.78457: stderr chunk (state=3): >>>debug2: match not found <<< 43681 1727204693.78473: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204693.78495: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 43681 1727204693.78591: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204693.78615: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204693.78687: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204693.80446: stdout chunk (state=3): >>>/root <<< 43681 1727204693.80633: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204693.80646: stdout chunk (state=3): >>><<< 43681 1727204693.80658: stderr chunk (state=3): >>><<< 43681 1727204693.80695: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204693.80723: _low_level_execute_command(): starting 43681 1727204693.80734: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo 
/root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204693.8071036-43864-177536372354426 `" && echo ansible-tmp-1727204693.8071036-43864-177536372354426="` echo /root/.ansible/tmp/ansible-tmp-1727204693.8071036-43864-177536372354426 `" ) && sleep 0' 43681 1727204693.81393: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204693.81407: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204693.81423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204693.81449: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204693.81467: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 43681 1727204693.81479: stderr chunk (state=3): >>>debug2: match not found <<< 43681 1727204693.81503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204693.81569: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204693.81616: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204693.81635: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204693.81656: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204693.81732: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204693.83730: stdout chunk (state=3): >>>ansible-tmp-1727204693.8071036-43864-177536372354426=/root/.ansible/tmp/ansible-tmp-1727204693.8071036-43864-177536372354426 <<< 43681 1727204693.83906: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204693.83927: stderr chunk (state=3): >>><<< 43681 1727204693.84096: stdout chunk (state=3): >>><<< 43681 1727204693.84100: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204693.8071036-43864-177536372354426=/root/.ansible/tmp/ansible-tmp-1727204693.8071036-43864-177536372354426 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204693.84102: variable 'ansible_module_compression' from source: unknown 43681 1727204693.84105: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-436817kzep24q/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 43681 1727204693.84160: variable 'ansible_facts' from source: unknown 43681 1727204693.84371: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204693.8071036-43864-177536372354426/AnsiballZ_setup.py 43681 1727204693.84572: Sending initial data 43681 1727204693.84583: Sent initial data (154 bytes) 43681 1727204693.85321: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204693.85394: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204693.85413: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204693.85447: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204693.85509: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204693.87143: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 43681 1727204693.87220: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 43681 1727204693.87275: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-436817kzep24q/tmpv192erg7 /root/.ansible/tmp/ansible-tmp-1727204693.8071036-43864-177536372354426/AnsiballZ_setup.py <<< 43681 1727204693.87279: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204693.8071036-43864-177536372354426/AnsiballZ_setup.py" <<< 43681 1727204693.87303: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-436817kzep24q/tmpv192erg7" to remote "/root/.ansible/tmp/ansible-tmp-1727204693.8071036-43864-177536372354426/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204693.8071036-43864-177536372354426/AnsiballZ_setup.py" <<< 43681 1727204693.89095: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204693.89264: stderr chunk (state=3): >>><<< 43681 1727204693.89268: stdout chunk (state=3): >>><<< 43681 1727204693.89271: done transferring module to remote 43681 1727204693.89273: _low_level_execute_command(): starting 43681 1727204693.89278: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204693.8071036-43864-177536372354426/ /root/.ansible/tmp/ansible-tmp-1727204693.8071036-43864-177536372354426/AnsiballZ_setup.py && sleep 0' 43681 1727204693.89676: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204693.89695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address <<< 43681 1727204693.89718: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 43681 1727204693.89830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204693.89852: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204693.89925: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204693.91757: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204693.91804: stderr chunk (state=3): >>><<< 43681 1727204693.91808: stdout chunk (state=3): >>><<< 43681 1727204693.91828: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204693.91831: _low_level_execute_command(): starting 43681 1727204693.91834: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204693.8071036-43864-177536372354426/AnsiballZ_setup.py && sleep 0' 43681 1727204693.92275: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204693.92279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204693.92281: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204693.92283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204693.92338: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204693.92342: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204693.92387: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204693.94551: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 43681 1727204693.94592: stdout chunk (state=3): >>>import _imp # builtin <<< 43681 1727204693.94622: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # <<< 43681 1727204693.94629: stdout chunk (state=3): >>>import '_weakref' # <<< 43681 1727204693.94706: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 43681 1727204693.94742: stdout chunk (state=3): >>>import 'posix' # <<< 43681 1727204693.94779: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 43681 1727204693.94819: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # <<< 43681 1727204693.94823: stdout chunk (state=3): >>># installed zipimport hook <<< 43681 1727204693.94875: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 43681 1727204693.94881: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 43681 1727204693.94903: stdout chunk (state=3): >>>import '_codecs' # <<< 43681 1727204693.94929: stdout chunk (state=3): >>>import 'codecs' # <<< 43681 1727204693.94964: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 43681 1727204693.94994: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 43681 1727204693.95009: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17dead44d0> <<< 43681 1727204693.95024: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17deaa3ad0> <<< 43681 1727204693.95048: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 43681 1727204693.95052: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17dead6a20> <<< 43681 1727204693.95080: stdout chunk (state=3): >>>import '_signal' # <<< 43681 1727204693.95107: stdout chunk (state=3): >>>import '_abc' # <<< 43681 1727204693.95114: stdout chunk (state=3): >>>import 'abc' # <<< 43681 1727204693.95135: stdout chunk (state=3): >>>import 'io' # <<< 43681 1727204693.95169: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 43681 1727204693.95265: stdout chunk (state=3): >>>import '_collections_abc' # <<< 43681 1727204693.95298: stdout chunk (state=3): >>>import 'genericpath' # <<< 43681 1727204693.95304: stdout chunk (state=3): >>>import 'posixpath' # <<< 43681 1727204693.95333: stdout chunk (state=3): >>>import 'os' # <<< 43681 1727204693.95360: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 43681 1727204693.95391: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' <<< 43681 1727204693.95422: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' <<< 43681 1727204693.95426: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 43681 1727204693.95440: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 43681 1727204693.95488: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de8c50a0> <<< 43681 1727204693.95537: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 43681 1727204693.95557: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de8c5fd0> <<< 43681 1727204693.95591: stdout chunk (state=3): >>>import 'site' # <<< 43681 1727204693.95621: stdout chunk (state=3): >>>Python 3.12.6 (main, Sep 9 2024, 
00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 43681 1727204693.96051: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 43681 1727204693.96055: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 43681 1727204693.96081: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 43681 1727204693.96096: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 43681 1727204693.96166: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 43681 1727204693.96170: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 43681 1727204693.96222: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de903e90> <<< 43681 1727204693.96226: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 43681 1727204693.96254: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 43681 1727204693.96286: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de903f50> <<< 43681 1727204693.96306: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 43681 1727204693.96325: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 43681 1727204693.96342: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 43681 1727204693.96394: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 43681 1727204693.96427: stdout chunk (state=3): >>>import 'itertools' # <<< 43681 1727204693.96445: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de93b860> <<< 43681 1727204693.96477: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 43681 1727204693.96503: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de93bef0> import '_collections' # <<< 43681 1727204693.96556: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de91bb60> <<< 43681 1727204693.96580: stdout chunk (state=3): >>>import '_functools' # <<< 43681 1727204693.96606: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f17de919280> <<< 43681 1727204693.96699: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de901040> <<< 43681 1727204693.96742: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 43681 1727204693.96774: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<< 43681 1727204693.96817: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 43681 1727204693.96822: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 43681 1727204693.96847: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 43681 1727204693.96879: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de95f740> <<< 43681 1727204693.96907: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de95e360> <<< 43681 1727204693.96931: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de91a270> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de902f30> <<< 43681 1727204693.97006: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 43681 1727204693.97025: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de990740> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de9002c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 43681 1727204693.97071: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 43681 1727204693.97098: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17de990bf0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de990aa0> <<< 43681 1727204693.97125: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17de990e30> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de8fede0> <<< 43681 1727204693.97163: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc 
matches /usr/lib64/python3.12/importlib/__init__.py <<< 43681 1727204693.97186: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 43681 1727204693.97224: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 43681 1727204693.97234: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de991520> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de9911f0> <<< 43681 1727204693.97241: stdout chunk (state=3): >>>import 'importlib.machinery' # <<< 43681 1727204693.97270: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 43681 1727204693.97303: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de992420> <<< 43681 1727204693.97322: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # <<< 43681 1727204693.97352: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 43681 1727204693.97399: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 43681 1727204693.97424: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de9ac650> <<< 43681 1727204693.97459: stdout chunk (state=3): >>>import 'errno' # <<< 43681 1727204693.97473: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 43681 1727204693.97480: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17de9add60> <<< 43681 1727204693.97507: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 43681 1727204693.97510: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 43681 1727204693.97542: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 43681 1727204693.97563: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de9aec60> <<< 43681 1727204693.97593: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 43681 1727204693.97628: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17de9af2c0> <<< 43681 1727204693.97632: stdout chunk (state=3): >>>import 'bz2' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f17de9ae1b0> <<< 43681 1727204693.97646: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 43681 1727204693.97704: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 43681 1727204693.97716: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17de9afd40> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de9af470> <<< 43681 1727204693.97757: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de992480> <<< 43681 1727204693.97810: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 43681 1727204693.97814: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 43681 1727204693.97841: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 43681 1727204693.97880: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17de6abcb0> <<< 43681 1727204693.97906: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py <<< 43681 1727204693.97937: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17de6d47a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de6d4500> <<< 43681 1727204693.97996: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17de6d47d0> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' <<< 43681 1727204693.98037: stdout chunk (state=3): >>># extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17de6d49b0> <<< 43681 1727204693.98049: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de6a9e50> <<< 43681 1727204693.98053: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 43681 1727204693.98146: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 43681 1727204693.98179: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 43681 1727204693.98206: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de6d6000> <<< 43681 1727204693.98237: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de6d4c80> <<< 43681 1727204693.98240: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de992b70> <<< 43681 1727204693.98269: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 43681 1727204693.98318: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 43681 1727204693.98346: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 43681 1727204693.98386: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 43681 1727204693.98418: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de7023c0> <<< 43681 1727204693.98473: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 43681 1727204693.98497: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 43681 1727204693.98523: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 43681 1727204693.98545: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 43681 1727204693.98582: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de71a510> <<< 43681 1727204693.98614: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 43681 1727204693.98675: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 43681 1727204693.98717: stdout chunk (state=3): >>>import 'ntpath' # <<< 43681 1727204693.98730: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py <<< 43681 1727204693.98738: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de7532f0> <<< 43681 1727204693.98752: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 43681 1727204693.98797: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 43681 
1727204693.98821: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 43681 1727204693.98868: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 43681 1727204693.98959: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de779a90> <<< 43681 1727204693.99040: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de753410> <<< 43681 1727204693.99083: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de71b1a0> <<< 43681 1727204693.99117: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py <<< 43681 1727204693.99130: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de594410> <<< 43681 1727204693.99144: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de719550> <<< 43681 1727204693.99153: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de6d6f60> <<< 43681 1727204693.99315: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 43681 1727204693.99336: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f17de5946e0> <<< 43681 1727204693.99515: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_g9cu3uk_/ansible_setup_payload.zip' <<< 43681 1727204693.99526: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.99673: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204693.99711: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 43681 1727204693.99717: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 43681 1727204693.99763: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 43681 1727204693.99838: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 43681 1727204693.99873: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de6021e0><<< 43681 1727204693.99885: stdout chunk (state=3): >>> <<< 43681 1727204693.99892: stdout chunk (state=3): >>>import '_typing' # <<< 43681 1727204694.00092: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de5d90d0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de5d8230> <<< 43681 1727204694.00108: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.00136: stdout chunk (state=3): >>>import 'ansible' # <<< 43681 1727204694.00153: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 43681 1727204694.00166: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.00191: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.00201: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 43681 1727204694.00214: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.01794: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.03069: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<< 43681 1727204694.03077: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de5db5f0> <<< 43681 1727204694.03104: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py <<< 43681 1727204694.03111: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 43681 1727204694.03134: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 43681 1727204694.03140: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 43681 1727204694.03168: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py <<< 43681 1727204694.03174: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 43681 1727204694.03204: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 43681 1727204694.03210: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17de631bb0> <<< 43681 1727204694.03248: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de631970> <<< 43681 1727204694.03279: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de631280> <<< 43681 1727204694.03306: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 43681 1727204694.03311: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 43681 1727204694.03350: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de6319d0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de602e70> <<< 43681 1727204694.03371: stdout chunk (state=3): >>>import 'atexit' # <<< 43681 1727204694.03402: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' <<< 43681 1727204694.03408: stdout chunk (state=3): >>># extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17de6328d0> <<< 43681 1727204694.03433: stdout 
chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 43681 1727204694.03436: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17de632b10> <<< 43681 1727204694.03457: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 43681 1727204694.03506: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 43681 1727204694.03520: stdout chunk (state=3): >>>import '_locale' # <<< 43681 1727204694.03569: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de633020> <<< 43681 1727204694.03575: stdout chunk (state=3): >>>import 'pwd' # <<< 43681 1727204694.03600: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 43681 1727204694.03624: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 43681 1727204694.03665: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de498dd0> <<< 43681 1727204694.03696: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 43681 1727204694.03705: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17de49a9f0> <<< 43681 1727204694.03723: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 43681 1727204694.03739: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 43681 1727204694.03780: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de49b380> <<< 43681 1727204694.03802: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 43681 1727204694.03830: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 43681 1727204694.03850: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de49c2c0> <<< 43681 1727204694.03869: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 43681 1727204694.03911: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 43681 1727204694.03935: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py <<< 43681 1727204694.03940: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 43681 1727204694.03997: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de49efc0> <<< 43681 1727204694.04037: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from 
'/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17de49f0e0> <<< 43681 1727204694.04062: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de49d280> <<< 43681 1727204694.04081: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 43681 1727204694.04118: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 43681 1727204694.04136: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py <<< 43681 1727204694.04147: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 43681 1727204694.04160: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 43681 1727204694.04195: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 43681 1727204694.04225: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py <<< 43681 1727204694.04238: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 43681 1727204694.04242: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de4a2f60> <<< 43681 1727204694.04251: stdout chunk (state=3): >>>import '_tokenize' # <<< 43681 1727204694.04322: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de4a1a30> <<< 43681 1727204694.04331: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de4a1790> <<< 43681 1727204694.04351: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 43681 1727204694.04354: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 43681 1727204694.04433: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de4a3b60> <<< 43681 1727204694.04469: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de49d790> <<< 43681 1727204694.04497: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 43681 1727204694.04503: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17de4e7140> <<< 43681 1727204694.04536: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f17de4e72c0> <<< 43681 1727204694.04555: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 43681 1727204694.04578: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 43681 1727204694.04603: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py <<< 43681 1727204694.04609: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 43681 1727204694.04641: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so'<<< 43681 1727204694.04645: stdout chunk (state=3): >>> import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17de4ece90> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de4ecc50> <<< 43681 1727204694.04664: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 43681 1727204694.04784: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 43681 1727204694.04835: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 43681 1727204694.04841: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17de4ef3e0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de4ed580> <<< 43681 1727204694.04876: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 43681 1727204694.04917: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 43681 1727204694.04941: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 43681 1727204694.04955: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 43681 1727204694.04972: stdout chunk (state=3): >>>import '_string' # <<< 43681 1727204694.05018: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de4f6b40> <<< 43681 1727204694.05179: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de4ef4d0> <<< 43681 1727204694.05258: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 43681 1727204694.05264: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17de4f7920> <<< 43681 1727204694.05298: stdout chunk (state=3): >>># extension module 'systemd._reader' 
loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 43681 1727204694.05305: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17de4f7b90> <<< 43681 1727204694.05352: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17de4f7ec0> <<< 43681 1727204694.05370: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de4e75c0> <<< 43681 1727204694.05397: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 43681 1727204694.05424: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 43681 1727204694.05449: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 43681 1727204694.05479: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 43681 1727204694.05511: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 43681 1727204694.05520: stdout chunk (state=3): >>>import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17de4fb5c0> <<< 43681 1727204694.05702: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 43681 1727204694.05722: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17de4fc680> <<< 43681 1727204694.05730: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de4f9d30> <<< 43681 1727204694.05772: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17de4fb0b0> <<< 43681 1727204694.05779: stdout chunk (state=3): >>>import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de4f9910> <<< 43681 1727204694.05787: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.05807: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # <<< 43681 1727204694.05833: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.05933: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 43681 1727204694.06039: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.06059: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 43681 1727204694.06078: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.06099: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 43681 1727204694.06121: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.06259: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.06406: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.07076: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.07761: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # <<< 43681 1727204694.07781: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 43681 1727204694.07810: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 43681 1727204694.07830: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 43681 1727204694.07887: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17de384860> <<< 43681 1727204694.08003: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 43681 1727204694.08033: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de385700> <<< 43681 1727204694.08039: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de4f8230> <<< 43681 1727204694.08101: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 43681 1727204694.08108: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.08135: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.08153: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 43681 1727204694.08165: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.08337: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.08524: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 43681 1727204694.08552: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de385dc0> <<< 43681 1727204694.08560: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.09130: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.09683: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.09770: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 
1727204694.09860: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 43681 1727204694.09874: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.09913: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.09961: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 43681 1727204694.09967: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.10050: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.10166: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 43681 1727204694.10187: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.10203: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.10213: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing' # <<< 43681 1727204694.10223: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.10273: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.10312: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 43681 1727204694.10329: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.10605: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.10905: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 43681 1727204694.10975: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 43681 1727204694.10998: stdout chunk (state=3): >>>import '_ast' # <<< 43681 1727204694.11091: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de3865d0> <<< 43681 1727204694.11108: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.11195: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.11281: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 43681 1727204694.11291: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # <<< 43681 1727204694.11302: stdout chunk (state=3): >>>import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 43681 1727204694.11328: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 43681 1727204694.11334: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 43681 1727204694.11421: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 43681 1727204694.11544: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17de38e060> <<< 43681 1727204694.11606: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17de38e9f0> <<< 43681 1727204694.11629: stdout chunk (state=3): >>>import 'hashlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f17de387440> <<< 43681 1727204694.11635: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.11685: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.11729: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 43681 1727204694.11735: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.11783: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.11834: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.11895: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.11969: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 43681 1727204694.12011: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 43681 1727204694.12109: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17de38d8b0> <<< 43681 1727204694.12156: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de38eb40> <<< 43681 1727204694.12191: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # <<< 43681 1727204694.12195: stdout chunk (state=3): >>>import 'ansible.module_utils.common.process' # <<< 43681 1727204694.12206: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.12273: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.12337: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.12373: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.12417: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 43681 1727204694.12433: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 43681 1727204694.12439: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 43681 1727204694.12467: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 43681 1727204694.12487: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 43681 1727204694.12545: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 43681 1727204694.12565: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 43681 1727204694.12586: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 43681 1727204694.12645: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f17de422d20> <<< 43681 1727204694.12700: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de398a70> <<< 43681 1727204694.12785: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de396ba0> <<< 43681 1727204694.12799: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de3969f0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 43681 1727204694.12806: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.12831: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.12861: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 43681 1727204694.12926: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 43681 1727204694.12936: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.12951: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # <<< 43681 1727204694.12975: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.13035: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.13105: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.13122: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.13151: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.13194: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.13240: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.13278: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.13324: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 43681 1727204694.13328: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.13418: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.13490: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.13520: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.13554: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 43681 1727204694.13567: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.13761: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.13959: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.14000: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.14057: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 43681 1727204694.14091: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 43681 1727204694.14109: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 43681 1727204694.14125: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 43681 1727204694.14147: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 43681 1727204694.14174: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de425a90> <<< 43681 1727204694.14201: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 43681 1727204694.14210: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 43681 1727204694.14230: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 43681 1727204694.14276: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 43681 1727204694.14305: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 43681 1727204694.14320: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 43681 1727204694.14327: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17dd948350> <<< 43681 1727204694.14357: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 43681 1727204694.14369: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17dd9486b0> <<< 43681 1727204694.14424: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de4053d0> <<< 43681 1727204694.14443: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de404620> <<< 43681 1727204694.14478: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de4241a0> <<< 43681 1727204694.14491: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de427c20> <<< 43681 1727204694.14520: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 43681 1727204694.14561: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 43681 1727204694.14586: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 43681 1727204694.14597: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 43681 1727204694.14617: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py <<< 43681 1727204694.14626: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 43681 1727204694.14661: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from 
'/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17dd94b650> <<< 43681 1727204694.14669: stdout chunk (state=3): >>>import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17dd94af00> <<< 43681 1727204694.14693: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' <<< 43681 1727204694.14705: stdout chunk (state=3): >>># extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17dd94b0e0> <<< 43681 1727204694.14711: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17dd94a330> <<< 43681 1727204694.14735: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 43681 1727204694.14836: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 43681 1727204694.14853: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17dd94b830> <<< 43681 1727204694.14871: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 43681 1727204694.14906: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 43681 1727204694.14938: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17dd9b2330> <<< 43681 1727204694.14971: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17dd9b0350> <<< 43681 1727204694.15002: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de427e00> import 'ansible.module_utils.facts.timeout' # <<< 43681 1727204694.15019: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # <<< 43681 1727204694.15036: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.15055: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 43681 1727204694.15077: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.15135: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.15198: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 43681 1727204694.15216: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.15273: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.15331: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 43681 1727204694.15340: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.15348: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 
1727204694.15367: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system' # <<< 43681 1727204694.15374: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.15407: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.15436: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # <<< 43681 1727204694.15449: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.15499: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.15552: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 43681 1727204694.15561: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.15611: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.15656: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 43681 1727204694.15665: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.15726: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.15792: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.15852: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.15915: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # <<< 43681 1727204694.15921: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.cmdline' # <<< 43681 1727204694.15939: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.16473: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.16967: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 43681 1727204694.16978: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.17030: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.17096: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.17130: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.17168: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 43681 1727204694.17185: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.17215: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.17252: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 43681 1727204694.17255: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.17323: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.17375: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 43681 1727204694.17398: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.17428: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.17462: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 43681 1727204694.17468: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.17505: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.17538: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 43681 1727204694.17551: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.17628: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.17726: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 43681 1727204694.17760: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17dd9b2660> <<< 43681 1727204694.17779: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 43681 1727204694.17811: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 43681 1727204694.17939: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17dd9b3290> <<< 43681 1727204694.17949: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.local' # <<< 43681 1727204694.17959: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.18034: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.18110: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 43681 1727204694.18116: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.18214: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.18315: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 43681 1727204694.18327: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.18399: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.18480: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 43681 1727204694.18487: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.18532: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.18585: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 43681 1727204694.18630: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 43681 1727204694.18710: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 43681 1727204694.18773: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17dd9e6720> <<< 43681 1727204694.18986: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17dd9cf170> import 'ansible.module_utils.facts.system.python' # <<< 43681 1727204694.19002: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.19060: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.19121: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 43681 1727204694.19126: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.19221: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.19311: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.19435: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.19599: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # <<< 43681 1727204694.19606: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available <<< 43681 1727204694.19651: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.19694: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 43681 1727204694.19708: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.19748: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.19800: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 43681 1727204694.19841: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 43681 1727204694.19870: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17dd802000> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17dd801d00> import 'ansible.module_utils.facts.system.user' # <<< 43681 1727204694.19894: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.19899: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.19907: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware' # <<< 43681 1727204694.19921: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.19963: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.20008: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 43681 1727204694.20014: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.20191: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.20358: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 43681 1727204694.20365: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.20477: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.20587: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.20632: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.20676: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 43681 1727204694.20687: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # <<< 43681 1727204694.20697: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.20724: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.20738: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.20897: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.21055: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # <<< 43681 1727204694.21062: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 43681 1727204694.21068: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.21205: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.21337: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 43681 1727204694.21353: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.21389: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.21428: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.22045: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.22618: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 43681 1727204694.22640: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.22750: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.22870: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 43681 1727204694.22874: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.22981: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.23096: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 43681 1727204694.23104: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.23268: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.23446: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available <<< 43681 1727204694.23469: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 43681 1727204694.23493: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.23533: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.23582: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 43681 1727204694.23590: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.23700: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.23803: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.24037: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.24264: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # <<< 43681 1727204694.24273: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.aix' # <<< 43681 1727204694.24279: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.24319: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.24361: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 43681 1727204694.24374: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.24395: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.24424: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 43681 1727204694.24430: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.24511: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.24588: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 43681 1727204694.24596: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.24617: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.24643: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # <<< 43681 1727204694.24656: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.24718: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.24778: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 
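The chunk stream above and below shows the AnsiballZ setup payload importing every per-platform fact collector (ansible.module_utils.facts.system.*, hardware.*, network.*, virtual.*); at runtime only the collectors whose platform matches the managed node actually contribute facts. As a rough sketch of that selection idea only — hypothetical names, not Ansible's actual collector classes — a platform-keyed registry might look like this:

    import platform

    # Hypothetical per-platform collectors. Real Ansible collectors are
    # classes carrying a _platform attribute; this is only an illustration
    # of "import everything, dispatch on the current platform".
    def linux_hardware_facts():
        return {"ansible_system": "Linux"}

    def freebsd_hardware_facts():
        return {"ansible_system": "FreeBSD"}

    HARDWARE_COLLECTORS = {
        "Linux": linux_hardware_facts,
        "FreeBSD": freebsd_hardware_facts,
    }

    def collect_hardware_facts():
        # Fall back to an empty dict when nothing matches the current platform.
        collector = HARDWARE_COLLECTORS.get(platform.system(), dict)
        return collector()

    print(collect_hardware_facts())
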
43681 1727204694.24793: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.24848: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.24918: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 43681 1727204694.24926: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.25218: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.25509: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 43681 1727204694.25516: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.25582: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.25644: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 43681 1727204694.25652: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.25695: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.25734: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 43681 1727204694.25740: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.25775: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.25815: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 43681 1727204694.25822: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.25858: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.25901: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 43681 1727204694.25905: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.25992: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.26075: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 43681 1727204694.26097: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.26111: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual' # <<< 43681 1727204694.26129: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.26172: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.26219: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 43681 1727204694.26241: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.26250: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.26276: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.26326: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.26381: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.26456: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.26543: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # <<< 43681 1727204694.26550: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 43681 1727204694.26559: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.26609: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.26663: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 43681 1727204694.26676: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.26895: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.27113: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 43681 1727204694.27121: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.27171: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.27224: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 43681 1727204694.27231: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.27278: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.27335: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 43681 1727204694.27344: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.27431: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.27524: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # <<< 43681 1727204694.27527: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.default_collectors' # <<< 43681 1727204694.27541: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.27632: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.27730: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # <<< 43681 1727204694.27738: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 43681 1727204694.27818: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.28459: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py <<< 43681 1727204694.28465: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 43681 1727204694.28495: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 43681 1727204694.28509: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 43681 1727204694.28542: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' <<< 43681 1727204694.28550: stdout chunk (state=3): >>># extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17dd82b200> <<< 43681 1727204694.28569: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17dd829c40> <<< 43681 1727204694.28625: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17dd82a360> <<< 43681 1727204694.29301: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBAI5YZQ7OH6eqgmanrwxkUl16pMvE2q26X32NofYRKBzF04m84VIsiCBP80rN+sGEKnRhTwlxJwcSfAyscmxkynk8ozeR0SaMEECkbOjee1DqGR1yz8VSKEIk2gZ+ImYscF6c32jGvz1w/gz9baswEs+v92Ljqv3+V3s8foVkwWM1AAAAFQDApo03iAyJzp9y7AillVl9LpN8rwAAAIBNHNvfLLH/rvWMdavYWGiljarx5Z8cDKFv4QiliuY2AenrQ5mjBN3ZJZuDpmwC9vuoPM+TWxp9pbrnVJy4VM6iS8c/Lr9I982fUD4neMvJEywdnYtsRhezGMCk57/Npw91h6EKhcAYiaFF53jl540WIjTvu2bEA8Hgb11YGH+isAAAAIAkremps+61DEFeDWQjRHbf8fZzhmpUYduU+sHRW5usa/1cOOeeN/8XBHfMST6TPedAY/6t7Oxda9D2mq6mo2Rl9arSQWcBypqwvzRiz0LGnRnElGtXKJALy6vYKG7xi+29ZmqlBvD14cB7/wSZqZP9MkRj3+QzQJLvNnuGRyLguA==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDBj8PEtqglWtlJ3r3hgP2TELjSd8JOOpjIitLlWjKdUao5ePB6PWTf9MZV0rLZr0re7hAS1EWeexARYQakyETmyOoPmRCaD5vvrfN3AJJ6I+O2EhApLpYrEORJbTfrme6AoCGmxQG8tR7j3YpVOvePZ65ka7FDUWnNLI0DWpyDURAKmvOxtiOcYazpmB7GJ/5ycpEAV7KGp7tEQ9MNIAbSaYTBXVBNa5V2HyEmcabs+/Qy/jp8OWy+Tl3uCUV0SmFplVCKib9Kp3eEMZd5udXsYnmUYtLNMJQkQOzTdol5AozustkdBnasVn/RSnQpWQMBrrUQMxchNOb8FDAuH6AONEVJl9mHY6mk3zfkkyPZE6sIrMIj0B48xTWzMIjC+N9SN7DRRUWzjYIqkL5fsYu0fkkGuZeNvyJRlv8h7oFWA7YtvNHdNYf41mkXryERg8V3zI0aZcmQul6XTOxywwd4b5sudMIng09hfyPOKtnYi6DIN2h5FxOWlvBEbLlcd2U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPUvqdp1GSRMDwSqfOZO1hLGpDfzy51B9cIhTK2AWy7qlUXPaSlJ0jc31uj+CW3SnUW36VSKRHdj9R9hJev9Zic=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFL7RdA+aCgUcBhcJBLwti3mnwduhYXxSw8RlI3Cvebm", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "cons<<< 43681 1727204694.29330: stdout chunk (state=3): >>>ole": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fips": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec272ed147e29e35f2e68cd6465c5ec1", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "04", "second": 
"54", "epoch": "1727204694", "epoch_int": "1727204694", "date": "2024-09-24", "time": "15:04:54", "iso8601_micro": "2024-09-24T19:04:54.287939Z", "iso8601": "2024-09-24T19:04:54Z", "iso8601_basic": "20240924T150454287939", "iso8601_basic_short": "20240924T150454", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 50414 10.31.10.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 50414 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_apparmor": {"status": "disabled"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 43681 1727204694.29892: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 43681 1727204694.29915: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ <<< 43681 1727204694.29944: stdout chunk (state=3): >>># clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc <<< 43681 1727204694.29951: stdout chunk (state=3): >>># cleanup[2] removing genericpath # cleanup[2] removing posixpath # 
cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum <<< 43681 1727204694.29979: stdout chunk (state=3): >>># cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil <<< 43681 1727204694.30008: stdout chunk (state=3): >>># cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible <<< 43681 1727204694.30025: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog <<< 43681 
1727204694.30049: stdout chunk (state=3): >>># cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors <<< 43681 1727204694.30080: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro <<< 43681 1727204694.30112: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing 
ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time <<< 43681 1727204694.30122: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin <<< 43681 1727204694.30150: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] 
removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd <<< 43681 1727204694.30175: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy 
ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 43681 1727204694.30513: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 43681 1727204694.30528: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util<<< 43681 1727204694.30544: stdout chunk (state=3): >>> <<< 43681 1727204694.30552: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma <<< 43681 1727204694.30571: stdout chunk (state=3): >>># destroy binascii # destroy zlib # destroy bz2<<< 43681 1727204694.30588: stdout chunk (state=3): >>> # destroy lzma # destroy zipfile._path <<< 43681 1727204694.30597: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 43681 1727204694.30637: stdout chunk (state=3): >>># destroy ntpath # destroy importlib <<< 43681 1727204694.30653: stdout chunk (state=3): >>># destroy zipimport <<< 43681 1727204694.30668: stdout chunk (state=3): >>># destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner <<< 43681 1727204694.30687: stdout chunk (state=3): >>># destroy _json # destroy grp # destroy encodings <<< 43681 1727204694.30705: stdout chunk (state=3): >>># destroy _locale # destroy locale # destroy select <<< 43681 
1727204694.30712: stdout chunk (state=3): >>># destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 43681 1727204694.30769: stdout chunk (state=3): >>># destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil <<< 43681 1727204694.30781: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 43681 1727204694.30833: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool <<< 43681 1727204694.30842: stdout chunk (state=3): >>># destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle <<< 43681 1727204694.30871: stdout chunk (state=3): >>># destroy _pickle # destroy queue <<< 43681 1727204694.30894: stdout chunk (state=3): >>># destroy _heapq # destroy _queue # destroy multiprocessing.process <<< 43681 1727204694.30899: stdout chunk (state=3): >>># destroy unicodedata # destroy tempfile # destroy multiprocessing.util <<< 43681 1727204694.30917: stdout chunk (state=3): >>># destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex <<< 43681 1727204694.30936: stdout chunk (state=3): >>># destroy fcntl # destroy datetime <<< 43681 1727204694.30942: stdout chunk (state=3): >>># destroy subprocess # destroy base64 <<< 43681 1727204694.30962: stdout chunk (state=3): >>># destroy _ssl <<< 43681 1727204694.30988: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux <<< 43681 1727204694.31000: stdout chunk (state=3): >>># destroy getpass # destroy pwd # destroy termios <<< 43681 1727204694.31015: stdout chunk (state=3): >>># destroy errno # destroy json <<< 43681 1727204694.31030: stdout chunk (state=3): >>># destroy socket # destroy struct <<< 43681 1727204694.31039: stdout chunk (state=3): >>># destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 43681 1727204694.31079: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep<<< 43681 1727204694.31122: stdout chunk (state=3): >>> # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux <<< 43681 1727204694.31126: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback <<< 43681 1727204694.31132: stdout chunk (state=3): >>># destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 43681 1727204694.31155: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib <<< 43681 1727204694.31170: stdout chunk (state=3): >>># cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # 
cleanup[3] wiping warnings <<< 43681 1727204694.31185: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum <<< 43681 1727204694.31202: stdout chunk (state=3): >>># cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools <<< 43681 1727204694.31218: stdout chunk (state=3): >>># cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types <<< 43681 1727204694.31235: stdout chunk (state=3): >>># cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat <<< 43681 1727204694.31255: stdout chunk (state=3): >>># cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs <<< 43681 1727204694.31271: stdout chunk (state=3): >>># cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp <<< 43681 1727204694.31288: stdout chunk (state=3): >>># cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux <<< 43681 1727204694.31297: stdout chunk (state=3): >>># destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 43681 1727204694.31429: stdout chunk (state=3): >>># destroy sys.monitoring <<< 43681 1727204694.31441: stdout chunk (state=3): >>># destroy _socket <<< 43681 1727204694.31448: stdout chunk (state=3): >>># destroy _collections <<< 43681 1727204694.31481: stdout chunk (state=3): >>># destroy platform # destroy _uuid <<< 43681 1727204694.31492: stdout chunk (state=3): >>># destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 43681 1727204694.31514: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 43681 1727204694.31554: stdout chunk (state=3): >>># destroy _typing <<< 43681 1727204694.31559: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse <<< 43681 1727204694.31574: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 43681 1727204694.31596: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 43681 1727204694.31609: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 43681 1727204694.31707: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # 
destroy encodings.idna <<< 43681 1727204694.31719: stdout chunk (state=3): >>># destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading <<< 43681 1727204694.31725: stdout chunk (state=3): >>># destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 43681 1727204694.31748: stdout chunk (state=3): >>># destroy _random <<< 43681 1727204694.31757: stdout chunk (state=3): >>># destroy _weakref <<< 43681 1727204694.31785: stdout chunk (state=3): >>># destroy _operator # destroy _sha2 # destroy _sre<<< 43681 1727204694.31792: stdout chunk (state=3): >>> # destroy _string # destroy re # destroy itertools <<< 43681 1727204694.31819: stdout chunk (state=3): >>># destroy _abc <<< 43681 1727204694.31829: stdout chunk (state=3): >>># destroy posix # destroy _functools # destroy builtins # destroy _thread <<< 43681 1727204694.31836: stdout chunk (state=3): >>># clear sys.audit hooks <<< 43681 1727204694.32275: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 43681 1727204694.32342: stderr chunk (state=3): >>><<< 43681 1727204694.32345: stdout chunk (state=3): >>><<< 43681 1727204694.32454: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17dead44d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17deaa3ad0> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17dead6a20> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de8c50a0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de8c5fd0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de903e90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de903f50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de93b860> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de93bef0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de91bb60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de919280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de901040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de95f740> import 're._parser' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f17de95e360> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de91a270> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de902f30> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de990740> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de9002c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17de990bf0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de990aa0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17de990e30> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de8fede0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de991520> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de9911f0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de992420> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de9ac650> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17de9add60> 
# /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de9aec60> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17de9af2c0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de9ae1b0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17de9afd40> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de9af470> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de992480> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17de6abcb0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17de6d47a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de6d4500> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17de6d47d0> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17de6d49b0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de6a9e50> # 
/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de6d6000> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de6d4c80> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de992b70> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de7023c0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de71a510> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de7532f0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de779a90> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de753410> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de71b1a0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de594410> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de719550> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de6d6f60> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f17de5946e0> # zipimport: found 103 names in '/tmp/ansible_setup_payload_g9cu3uk_/ansible_setup_payload.zip' # 
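The "zipimport: found 103 names in '/tmp/ansible_setup_payload_g9cu3uk_/ansible_setup_payload.zip'" record above is the point where the bundled module payload becomes importable directly from its zip archive; the ansible.module_utils imports that follow are resolved through that zipimporter. A self-contained sketch of the same mechanism, using illustrative names (demo_payload.zip, demo_pkg) rather than Ansible's real payload:

# Bundle a package into a zip, then import it via the zipimport path hook,
# which is the mechanism the log shows for ansible_setup_payload.zip.
import sys
import tempfile
import zipfile

tmp = tempfile.mkdtemp()
payload = f"{tmp}/demo_payload.zip"   # stand-in for the per-run temp payload

with zipfile.ZipFile(payload, "w") as zf:
    zf.writestr("demo_pkg/__init__.py", "MESSAGE = 'loaded from a zip payload'\n")

sys.path.insert(0, payload)           # zip archives on sys.path are handled by zipimporter
import demo_pkg                       # resolved from inside the archive

print(demo_pkg.__file__)              # .../demo_payload.zip/demo_pkg/__init__.py
print(demo_pkg.MESSAGE)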
zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de6021e0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de5d90d0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de5d8230> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de5db5f0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17de631bb0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de631970> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de631280> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de6319d0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de602e70> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17de6328d0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17de632b10> # 
/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de633020> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de498dd0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17de49a9f0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de49b380> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de49c2c0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de49efc0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17de49f0e0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de49d280> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de4a2f60> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de4a1a30> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de4a1790> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from 
'/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de4a3b60> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de49d790> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17de4e7140> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de4e72c0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17de4ece90> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de4ecc50> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17de4ef3e0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de4ed580> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de4f6b40> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de4ef4d0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17de4f7920> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17de4f7b90> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17de4f7ec0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de4e75c0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17de4fb5c0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17de4fc680> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de4f9d30> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17de4fb0b0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de4f9910> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17de384860> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches 
/usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de385700> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de4f8230> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de385dc0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de3865d0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17de38e060> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17de38e9f0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de387440> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from 
'/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17de38d8b0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de38eb40> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de422d20> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de398a70> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de396ba0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de3969f0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de425a90> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17dd948350> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17dd9486b0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de4053d0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de404620> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de4241a0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de427c20> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17dd94b650> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17dd94af00> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17dd94b0e0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17dd94a330> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17dd94b830> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f17dd9b2330> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17dd9b0350> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17de427e00> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17dd9b2660> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17dd9b3290> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f17dd9e6720> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17dd9cf170> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17dd802000> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17dd801d00> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # 
zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f17dd82b200> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17dd829c40> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f17dd82a360> {"ansible_facts": {"ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBAI5YZQ7OH6eqgmanrwxkUl16pMvE2q26X32NofYRKBzF04m84VIsiCBP80rN+sGEKnRhTwlxJwcSfAyscmxkynk8ozeR0SaMEECkbOjee1DqGR1yz8VSKEIk2gZ+ImYscF6c32jGvz1w/gz9baswEs+v92Ljqv3+V3s8foVkwWM1AAAAFQDApo03iAyJzp9y7AillVl9LpN8rwAAAIBNHNvfLLH/rvWMdavYWGiljarx5Z8cDKFv4QiliuY2AenrQ5mjBN3ZJZuDpmwC9vuoPM+TWxp9pbrnVJy4VM6iS8c/Lr9I982fUD4neMvJEywdnYtsRhezGMCk57/Npw91h6EKhcAYiaFF53jl540WIjTvu2bEA8Hgb11YGH+isAAAAIAkremps+61DEFeDWQjRHbf8fZzhmpUYduU+sHRW5usa/1cOOeeN/8XBHfMST6TPedAY/6t7Oxda9D2mq6mo2Rl9arSQWcBypqwvzRiz0LGnRnElGtXKJALy6vYKG7xi+29ZmqlBvD14cB7/wSZqZP9MkRj3+QzQJLvNnuGRyLguA==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDBj8PEtqglWtlJ3r3hgP2TELjSd8JOOpjIitLlWjKdUao5ePB6PWTf9MZV0rLZr0re7hAS1EWeexARYQakyETmyOoPmRCaD5vvrfN3AJJ6I+O2EhApLpYrEORJbTfrme6AoCGmxQG8tR7j3YpVOvePZ65ka7FDUWnNLI0DWpyDURAKmvOxtiOcYazpmB7GJ/5ycpEAV7KGp7tEQ9MNIAbSaYTBXVBNa5V2HyEmcabs+/Qy/jp8OWy+Tl3uCUV0SmFplVCKib9Kp3eEMZd5udXsYnmUYtLNMJQkQOzTdol5AozustkdBnasVn/RSnQpWQMBrrUQMxchNOb8FDAuH6AONEVJl9mHY6mk3zfkkyPZE6sIrMIj0B48xTWzMIjC+N9SN7DRRUWzjYIqkL5fsYu0fkkGuZeNvyJRlv8h7oFWA7YtvNHdNYf41mkXryERg8V3zI0aZcmQul6XTOxywwd4b5sudMIng09hfyPOKtnYi6DIN2h5FxOWlvBEbLlcd2U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPUvqdp1GSRMDwSqfOZO1hLGpDfzy51B9cIhTK2AWy7qlUXPaSlJ0jc31uj+CW3SnUW36VSKRHdj9R9hJev9Zic=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFL7RdA+aCgUcBhcJBLwti3mnwduhYXxSw8RlI3Cvebm", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fips": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec272ed147e29e35f2e68cd6465c5ec1", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "04", "second": "54", "epoch": "1727204694", "epoch_int": 
"1727204694", "date": "2024-09-24", "time": "15:04:54", "iso8601_micro": "2024-09-24T19:04:54.287939Z", "iso8601": "2024-09-24T19:04:54Z", "iso8601_basic": "20240924T150454287939", "iso8601_basic_short": "20240924T150454", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 50414 10.31.10.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 50414 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_apparmor": {"status": "disabled"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # 
cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] 
removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing 
__mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing 
ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network 
# destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping 
ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # 
destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. [WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing 
re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] 
removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing 
ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing 
ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # 
destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] 
wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 43681 1727204694.33327: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': 
['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204693.8071036-43864-177536372354426/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 43681 1727204694.33331: _low_level_execute_command(): starting 43681 1727204694.33334: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204693.8071036-43864-177536372354426/ > /dev/null 2>&1 && sleep 0' 43681 1727204694.33388: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204694.33394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204694.33396: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204694.33398: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204694.33401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204694.33455: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204694.33465: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204694.33500: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204694.35415: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204694.35466: stderr chunk (state=3): >>><<< 43681 1727204694.35470: stdout chunk (state=3): >>><<< 43681 1727204694.35486: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 43681 1727204694.35495: handler run complete 43681 1727204694.35540: variable 'ansible_facts' from source: unknown 43681 1727204694.35588: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204694.35695: variable 'ansible_facts' from source: unknown 43681 1727204694.35741: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204694.35788: attempt loop complete, returning result 43681 1727204694.35793: _execute() done 43681 1727204694.35798: dumping result to json 43681 1727204694.35809: done dumping result, returning 43681 1727204694.35818: done running TaskExecutor() for managed-node3/TASK: Gather the minimum subset of ansible_facts required by the network role test [12b410aa-8751-9e86-7728-0000000000c0] 43681 1727204694.35828: sending task result for task 12b410aa-8751-9e86-7728-0000000000c0 43681 1727204694.35979: done sending task result for task 12b410aa-8751-9e86-7728-0000000000c0 43681 1727204694.35982: WORKER PROCESS EXITING ok: [managed-node3] 43681 1727204694.36172: no more pending results, returning what we have 43681 1727204694.36175: results queue empty 43681 1727204694.36176: checking for any_errors_fatal 43681 1727204694.36177: done checking for any_errors_fatal 43681 1727204694.36178: checking for max_fail_percentage 43681 1727204694.36180: done checking for max_fail_percentage 43681 1727204694.36180: checking to see if all hosts have failed and the running result is not ok 43681 1727204694.36181: done checking to see if all hosts have failed 43681 1727204694.36182: getting the remaining hosts for this loop 43681 1727204694.36183: done getting the remaining hosts for this loop 43681 1727204694.36187: getting the next task for host managed-node3 43681 1727204694.36197: done getting next task for host managed-node3 43681 1727204694.36204: ^ task is: TASK: Check if system is ostree 43681 1727204694.36207: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204694.36211: getting variables 43681 1727204694.36213: in VariableManager get_vars() 43681 1727204694.36234: Calling all_inventory to load vars for managed-node3 43681 1727204694.36237: Calling groups_inventory to load vars for managed-node3 43681 1727204694.36240: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204694.36248: Calling all_plugins_play to load vars for managed-node3 43681 1727204694.36251: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204694.36253: Calling groups_plugins_play to load vars for managed-node3 43681 1727204694.36400: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204694.36576: done with get_vars() 43681 1727204694.36584: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Tuesday 24 September 2024 15:04:54 -0400 (0:00:00.637) 0:00:02.033 ***** 43681 1727204694.36663: entering _queue_task() for managed-node3/stat 43681 1727204694.36880: worker is 1 (out of 1 available) 43681 1727204694.36895: exiting _queue_task() for managed-node3/stat 43681 1727204694.36907: done queuing things up, now waiting for results queue to drain 43681 1727204694.36909: waiting for pending results... 43681 1727204694.37070: running TaskExecutor() for managed-node3/TASK: Check if system is ostree 43681 1727204694.37160: in run() - task 12b410aa-8751-9e86-7728-0000000000c2 43681 1727204694.37172: variable 'ansible_search_path' from source: unknown 43681 1727204694.37175: variable 'ansible_search_path' from source: unknown 43681 1727204694.37211: calling self._execute() 43681 1727204694.37280: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204694.37287: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204694.37299: variable 'omit' from source: magic vars 43681 1727204694.37701: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 43681 1727204694.37931: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 43681 1727204694.37970: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 43681 1727204694.38001: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 43681 1727204694.38033: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 43681 1727204694.38106: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 43681 1727204694.38135: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 43681 1727204694.38157: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204694.38179: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 
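The long runs of "# cleanup[2] removing ...", "# destroy ..." and "# cleanup[3] wiping ..." above are not Ansible trace messages: they are the remote Python interpreter's own verbose import/teardown output, emitted because the module is launched with PYTHONVERBOSE=1 (the exact command is visible in the _low_level_execute_command() calls). Since that output shares stdout with the module's JSON result, the controller prints "[WARNING]: Module invocation had junk after the JSON data" and recovers the result by pulling the JSON document out of the surrounding noise. The sketch below only illustrates that idea and is not Ansible's actual parser; the function name and the heuristics are assumptions.

    import json

    def extract_module_result(raw_stdout: str) -> dict:
        # Minimal sketch: find the span of lines that looks like a JSON object
        # and ignore the interpreter noise ("import ...", "# cleanup ...",
        # "# destroy ...") around it. Not Ansible's real implementation.
        lines = raw_stdout.splitlines()
        starts = [i for i, line in enumerate(lines) if line.lstrip().startswith("{")]
        ends = [i for i, line in enumerate(lines) if line.rstrip().endswith("}")]
        if not starts or not ends:
            raise ValueError("no JSON object found in module output")
        return json.loads("\n".join(lines[starts[0]:ends[-1] + 1]))

In this run the recovered payload is the ansible_facts result that the controller then reports as "ok: [managed-node3]" for the fact-gathering task above.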
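The surrounding low-level trace also shows the per-task execution cycle Ansible drives over the multiplexed SSH connection: create a private remote temp directory (umask 77 && mkdir -p ...), upload the generated AnsiballZ_*.py payload, chmod u+x the directory and script, run it with the remote Python, and finally rm -f -r the temp directory (the rm for the preceding setup task appears just above, the mkdir/put/chmod for the stat task just below). The following is only a rough sketch of that shape under stated assumptions: the helper names and paths are invented, scp stands in for the SFTP put seen in the log, and real Ansible does considerably more.

    import subprocess

    def run_module_over_ssh(host: str, local_payload: str, remote_tmp: str) -> str:
        # Illustrative only: mirrors the command sequence visible in the log.
        def ssh(cmd: str) -> str:
            return subprocess.run(
                ["ssh", host, cmd], check=True, capture_output=True, text=True
            ).stdout

        remote_script = f"{remote_tmp}/AnsiballZ_module.py"
        ssh(f"umask 77 && mkdir -p {remote_tmp}")            # private temp dir
        subprocess.run(["scp", local_payload, f"{host}:{remote_script}"], check=True)
        ssh(f"chmod u+x {remote_tmp} {remote_script}")       # make executable
        out = ssh(f"PYTHONVERBOSE=1 /usr/bin/python3.12 {remote_script}")
        ssh(f"rm -f -r {remote_tmp}")                        # clean up afterwards
        return out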
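The task queued here, "Check if system is ostree" (el_repo_setup.yml:17), dispatches the stat module to the target, and the next trace line shows its guard being evaluated: it only runs while the __network_is_ostree fact is still undefined. Conceptually the check boils down to testing for an OSTree marker file on the managed host; the marker path in the sketch below (/run/ostree-booted) is an assumption for illustration and does not appear anywhere in this log.

    import os

    def system_is_ostree(marker: str = "/run/ostree-booted") -> bool:
        # Hypothetical marker path; the real task records the stat result, and
        # presumably a later task derives __network_is_ostree from whether the
        # file exists on the managed node.
        return os.path.exists(marker)

    print(system_is_ostree())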
43681 1727204694.38283: Evaluated conditional (not __network_is_ostree is defined): True 43681 1727204694.38290: variable 'omit' from source: magic vars 43681 1727204694.38322: variable 'omit' from source: magic vars 43681 1727204694.38358: variable 'omit' from source: magic vars 43681 1727204694.38379: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204694.38405: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204694.38422: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204694.38441: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204694.38450: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204694.38478: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204694.38481: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204694.38486: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204694.38568: Set connection var ansible_shell_type to sh 43681 1727204694.38573: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204694.38581: Set connection var ansible_timeout to 10 43681 1727204694.38591: Set connection var ansible_pipelining to False 43681 1727204694.38597: Set connection var ansible_connection to ssh 43681 1727204694.38604: Set connection var ansible_shell_executable to /bin/sh 43681 1727204694.38623: variable 'ansible_shell_executable' from source: unknown 43681 1727204694.38626: variable 'ansible_connection' from source: unknown 43681 1727204694.38630: variable 'ansible_module_compression' from source: unknown 43681 1727204694.38632: variable 'ansible_shell_type' from source: unknown 43681 1727204694.38637: variable 'ansible_shell_executable' from source: unknown 43681 1727204694.38640: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204694.38646: variable 'ansible_pipelining' from source: unknown 43681 1727204694.38648: variable 'ansible_timeout' from source: unknown 43681 1727204694.38655: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204694.38773: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 43681 1727204694.38786: variable 'omit' from source: magic vars 43681 1727204694.38791: starting attempt loop 43681 1727204694.38794: running the handler 43681 1727204694.38805: _low_level_execute_command(): starting 43681 1727204694.38813: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 43681 1727204694.39355: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204694.39359: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 
1727204694.39361: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204694.39364: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204694.39415: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204694.39419: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204694.39421: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204694.39471: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204694.41142: stdout chunk (state=3): >>>/root <<< 43681 1727204694.41250: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204694.41306: stderr chunk (state=3): >>><<< 43681 1727204694.41310: stdout chunk (state=3): >>><<< 43681 1727204694.41332: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204694.41345: _low_level_execute_command(): starting 43681 1727204694.41352: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204694.4133108-43880-122564553418301 `" && echo ansible-tmp-1727204694.4133108-43880-122564553418301="` echo /root/.ansible/tmp/ansible-tmp-1727204694.4133108-43880-122564553418301 `" ) && sleep 0' 43681 1727204694.41782: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204694.41820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204694.41823: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 43681 1727204694.41826: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204694.41830: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204694.41832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204694.41882: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204694.41886: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204694.41929: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204694.43916: stdout chunk (state=3): >>>ansible-tmp-1727204694.4133108-43880-122564553418301=/root/.ansible/tmp/ansible-tmp-1727204694.4133108-43880-122564553418301 <<< 43681 1727204694.44034: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204694.44081: stderr chunk (state=3): >>><<< 43681 1727204694.44084: stdout chunk (state=3): >>><<< 43681 1727204694.44106: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204694.4133108-43880-122564553418301=/root/.ansible/tmp/ansible-tmp-1727204694.4133108-43880-122564553418301 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204694.44154: variable 'ansible_module_compression' from source: unknown 43681 1727204694.44203: ANSIBALLZ: Using lock for stat 43681 1727204694.44207: ANSIBALLZ: Acquiring lock 43681 1727204694.44210: ANSIBALLZ: Lock acquired: 140156139378528 43681 1727204694.44212: ANSIBALLZ: Creating module 43681 1727204694.54129: ANSIBALLZ: Writing module into payload 43681 1727204694.54210: ANSIBALLZ: Writing module 43681 1727204694.54235: ANSIBALLZ: Renaming module 43681 1727204694.54240: ANSIBALLZ: Done creating module 43681 1727204694.54257: variable 'ansible_facts' from source: unknown 43681 1727204694.54305: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204694.4133108-43880-122564553418301/AnsiballZ_stat.py 43681 1727204694.54427: Sending initial data 43681 1727204694.54431: Sent 
initial data (153 bytes) 43681 1727204694.54933: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204694.54937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204694.54940: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 43681 1727204694.54943: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 43681 1727204694.54945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204694.54998: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204694.55003: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204694.55011: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204694.55049: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204694.56790: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 43681 1727204694.56798: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 43681 1727204694.56828: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 43681 1727204694.56872: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-436817kzep24q/tmpn2mw61_j /root/.ansible/tmp/ansible-tmp-1727204694.4133108-43880-122564553418301/AnsiballZ_stat.py <<< 43681 1727204694.56876: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204694.4133108-43880-122564553418301/AnsiballZ_stat.py" <<< 43681 1727204694.56905: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-436817kzep24q/tmpn2mw61_j" to remote "/root/.ansible/tmp/ansible-tmp-1727204694.4133108-43880-122564553418301/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204694.4133108-43880-122564553418301/AnsiballZ_stat.py" <<< 43681 1727204694.57665: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204694.57733: stderr chunk (state=3): >>><<< 43681 1727204694.57741: stdout chunk (state=3): >>><<< 43681 1727204694.57761: done transferring module to remote 43681 1727204694.57775: _low_level_execute_command(): starting 43681 1727204694.57780: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204694.4133108-43880-122564553418301/ /root/.ansible/tmp/ansible-tmp-1727204694.4133108-43880-122564553418301/AnsiballZ_stat.py && sleep 0' 43681 1727204694.58244: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204694.58248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204694.58250: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204694.58252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204694.58309: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204694.58314: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204694.58353: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204694.60272: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204694.60322: stderr chunk (state=3): >>><<< 43681 1727204694.60325: stdout chunk (state=3): >>><<< 43681 1727204694.60339: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204694.60342: _low_level_execute_command(): starting 43681 1727204694.60349: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204694.4133108-43880-122564553418301/AnsiballZ_stat.py && sleep 0' 43681 1727204694.60818: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204694.60823: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204694.60826: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 43681 1727204694.60828: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 43681 1727204694.60830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204694.60880: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204694.60883: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204694.60931: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204694.63161: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 43681 1727204694.63193: stdout chunk (state=3): >>>import _imp # builtin <<< 43681 1727204694.63229: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 43681 1727204694.63305: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 43681 1727204694.63348: stdout chunk (state=3): >>>import 'posix' # <<< 43681 1727204694.63385: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 43681 1727204694.63396: stdout chunk (state=3): >>># installing zipimport hook <<< 43681 1727204694.63412: stdout chunk (state=3): >>>import 'time' # <<< 43681 1727204694.63427: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 43681 1727204694.63481: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code 
object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 43681 1727204694.63505: stdout chunk (state=3): >>>import '_codecs' # <<< 43681 1727204694.63527: stdout chunk (state=3): >>>import 'codecs' # <<< 43681 1727204694.63569: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 43681 1727204694.63608: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 43681 1727204694.63612: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99408d44d0> <<< 43681 1727204694.63614: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99408a3ad0> <<< 43681 1727204694.63640: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py <<< 43681 1727204694.63653: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 43681 1727204694.63667: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99408d6a20> <<< 43681 1727204694.63677: stdout chunk (state=3): >>>import '_signal' # <<< 43681 1727204694.63706: stdout chunk (state=3): >>>import '_abc' # <<< 43681 1727204694.63713: stdout chunk (state=3): >>>import 'abc' # <<< 43681 1727204694.63734: stdout chunk (state=3): >>>import 'io' # <<< 43681 1727204694.63774: stdout chunk (state=3): >>>import '_stat' # <<< 43681 1727204694.63779: stdout chunk (state=3): >>>import 'stat' # <<< 43681 1727204694.63866: stdout chunk (state=3): >>>import '_collections_abc' # <<< 43681 1727204694.63897: stdout chunk (state=3): >>>import 'genericpath' # <<< 43681 1727204694.63902: stdout chunk (state=3): >>>import 'posixpath' # <<< 43681 1727204694.63930: stdout chunk (state=3): >>>import 'os' # <<< 43681 1727204694.63936: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 43681 1727204694.63960: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages <<< 43681 1727204694.63995: stdout chunk (state=3): >>>Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<< 43681 1727204694.64001: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 43681 1727204694.64025: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py <<< 43681 1727204694.64041: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 43681 1727204694.64059: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99406c50a0> <<< 43681 1727204694.64126: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 43681 1727204694.64133: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 43681 1727204694.64139: stdout chunk (state=3): >>>import '_distutils_hack' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f99406c5fd0> <<< 43681 1727204694.64171: stdout chunk (state=3): >>>import 'site' # <<< 43681 1727204694.64205: stdout chunk (state=3): >>>Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 43681 1727204694.64448: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 43681 1727204694.64461: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 43681 1727204694.64481: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 43681 1727204694.64496: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 43681 1727204694.64519: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 43681 1727204694.64558: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 43681 1727204694.64579: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 43681 1727204694.64602: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 43681 1727204694.64618: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9940703e90> <<< 43681 1727204694.64638: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 43681 1727204694.64657: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 43681 1727204694.64683: stdout chunk (state=3): >>>import '_operator' # <<< 43681 1727204694.64695: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9940703f50> <<< 43681 1727204694.64708: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 43681 1727204694.64734: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 43681 1727204694.64761: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 43681 1727204694.64811: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 43681 1727204694.64830: stdout chunk (state=3): >>>import 'itertools' # <<< 43681 1727204694.64866: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' <<< 43681 1727204694.64874: stdout chunk (state=3): >>>import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f994073b860> <<< 43681 1727204694.64893: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 43681 1727204694.64896: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<< 43681 
1727204694.64915: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f994073bef0> <<< 43681 1727204694.64923: stdout chunk (state=3): >>>import '_collections' # <<< 43681 1727204694.64972: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f994071bb60> <<< 43681 1727204694.64984: stdout chunk (state=3): >>>import '_functools' # <<< 43681 1727204694.65014: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9940719280> <<< 43681 1727204694.65114: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9940701040> <<< 43681 1727204694.65141: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 43681 1727204694.65163: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 43681 1727204694.65181: stdout chunk (state=3): >>>import '_sre' # <<< 43681 1727204694.65203: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 43681 1727204694.65224: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 43681 1727204694.65248: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 43681 1727204694.65254: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 43681 1727204694.65300: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f994075f740> <<< 43681 1727204694.65307: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f994075e360> <<< 43681 1727204694.65336: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' <<< 43681 1727204694.65343: stdout chunk (state=3): >>>import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f994071a270> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9940702f30> <<< 43681 1727204694.65409: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 43681 1727204694.65413: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' <<< 43681 1727204694.65416: stdout chunk (state=3): >>>import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9940790740> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99407002c0> <<< 43681 1727204694.65440: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc'<<< 43681 1727204694.65445: stdout chunk (state=3): >>> <<< 43681 1727204694.65475: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 43681 1727204694.65478: stdout chunk (state=3): >>># extension module 
'_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9940790bf0> <<< 43681 1727204694.65484: stdout chunk (state=3): >>>import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9940790aa0> <<< 43681 1727204694.65530: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 43681 1727204694.65537: stdout chunk (state=3): >>>import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9940790e30> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99406fede0> <<< 43681 1727204694.65569: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 43681 1727204694.65597: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 43681 1727204694.65628: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 43681 1727204694.65640: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9940791520> <<< 43681 1727204694.65662: stdout chunk (state=3): >>>import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99407911f0> <<< 43681 1727204694.65667: stdout chunk (state=3): >>>import 'importlib.machinery' # <<< 43681 1727204694.65692: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py <<< 43681 1727204694.65702: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 43681 1727204694.65720: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9940792420> <<< 43681 1727204694.65732: stdout chunk (state=3): >>>import 'importlib.util' # <<< 43681 1727204694.65747: stdout chunk (state=3): >>>import 'runpy' # <<< 43681 1727204694.65764: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 43681 1727204694.65801: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 43681 1727204694.65826: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py <<< 43681 1727204694.65830: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99407ac650> <<< 43681 1727204694.65855: stdout chunk (state=3): >>>import 'errno' # <<< 43681 1727204694.65884: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 43681 1727204694.65893: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f99407add60> <<< 43681 1727204694.65912: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 43681 1727204694.65927: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 43681 1727204694.65957: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 43681 1727204694.65976: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99407aec60> <<< 43681 1727204694.66018: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 43681 1727204694.66023: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f99407af2c0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99407ae1b0><<< 43681 1727204694.66030: stdout chunk (state=3): >>> <<< 43681 1727204694.66057: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 43681 1727204694.66065: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 43681 1727204694.66109: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 43681 1727204694.66141: stdout chunk (state=3): >>>import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f99407afd40> <<< 43681 1727204694.66144: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99407af470> <<< 43681 1727204694.66264: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9940792480> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 43681 1727204694.66311: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f994052fcb0> <<< 43681 1727204694.66373: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f99405587a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9940558500> <<< 43681 1727204694.66404: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9940558620> <<< 43681 1727204694.66451: stdout chunk (state=3): >>># extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9940558980> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f994052de50> <<< 43681 1727204694.66469: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 43681 1727204694.66566: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 43681 1727204694.66597: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 43681 1727204694.66613: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 43681 1727204694.66616: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f994055a090> <<< 43681 1727204694.66635: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9940558d10> <<< 43681 1727204694.66658: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9940792b70> <<< 43681 1727204694.66685: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 43681 1727204694.66737: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 43681 1727204694.66759: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 43681 1727204694.66808: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 43681 1727204694.66834: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9940582450> <<< 43681 1727204694.66886: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 43681 1727204694.66902: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 43681 1727204694.66925: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 43681 1727204694.66945: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 43681 1727204694.66996: stdout chunk (state=3): >>>import 'contextlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f994059e5a0> <<< 43681 1727204694.67019: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 43681 1727204694.67061: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 43681 1727204694.67120: stdout chunk (state=3): >>>import 'ntpath' # <<< 43681 1727204694.67145: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc'<<< 43681 1727204694.67157: stdout chunk (state=3): >>> import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99405d72f0> <<< 43681 1727204694.67175: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 43681 1727204694.67206: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 43681 1727204694.67235: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 43681 1727204694.67276: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 43681 1727204694.67369: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99405fda90> <<< 43681 1727204694.67447: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99405d7410> <<< 43681 1727204694.67496: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f994059f230> <<< 43681 1727204694.67522: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py <<< 43681 1727204694.67532: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99404143b0> <<< 43681 1727204694.67548: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f994059d5e0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f994055aff0> <<< 43681 1727204694.67652: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 43681 1727204694.67670: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f994059d700> <<< 43681 1727204694.67752: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_xmlmpoce/ansible_stat_payload.zip' # zipimport: zlib available <<< 43681 1727204694.67911: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.67937: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 43681 1727204694.67952: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 43681 1727204694.67993: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 43681 1727204694.68073: 
stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 43681 1727204694.68110: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py <<< 43681 1727204694.68117: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f994046e030> import '_typing' # <<< 43681 1727204694.68325: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9940444f20> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9940417fb0> <<< 43681 1727204694.68335: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.68366: stdout chunk (state=3): >>>import 'ansible' # <<< 43681 1727204694.68375: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.68413: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.68417: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.68442: stdout chunk (state=3): >>>import 'ansible.module_utils' # # zipimport: zlib available <<< 43681 1727204694.70013: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.71409: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9940447ec0> <<< 43681 1727204694.71433: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9940499b20> <<< 43681 1727204694.71456: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99404998b0> <<< 43681 1727204694.71508: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99404991c0> <<< 43681 1727204694.71641: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 43681 1727204694.71646: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9940499610> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f994046ecc0> import 'atexit' # # extension module 'grp' loaded from 
'/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f994049a8d0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f994049ab10> <<< 43681 1727204694.71662: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 43681 1727204694.71711: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 43681 1727204694.71735: stdout chunk (state=3): >>>import '_locale' # <<< 43681 1727204694.71767: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f994049aff0> <<< 43681 1727204694.71883: stdout chunk (state=3): >>>import 'pwd' # <<< 43681 1727204694.71887: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 43681 1727204694.71920: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99402f8e00> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f99402faa20> <<< 43681 1727204694.71999: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 43681 1727204694.72003: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99402fb380> <<< 43681 1727204694.72067: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 43681 1727204694.72074: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 43681 1727204694.72186: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99402fc2c0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 43681 1727204694.72228: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99402ff020> <<< 43681 1727204694.72241: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f99402ff110> <<< 43681 1727204694.72272: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99402fd2e0> <<< 43681 1727204694.72276: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 43681 1727204694.72334: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 43681 1727204694.72337: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 43681 1727204694.72350: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 43681 1727204694.72374: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 43681 1727204694.72429: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 43681 1727204694.72437: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9940302fc0> import '_tokenize' # <<< 43681 1727204694.72643: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9940301a90> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99403017f0> <<< 43681 1727204694.72650: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 43681 1727204694.72674: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9940303f80> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99402fd7f0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f994034b200> <<< 43681 1727204694.72711: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f994034b350> <<< 43681 1727204694.72737: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 43681 1727204694.72772: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 43681 1727204694.72801: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 43681 1727204694.72858: stdout chunk (state=3): >>># extension module 
'_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9940350f50> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9940350ce0> <<< 43681 1727204694.72869: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 43681 1727204694.72954: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 43681 1727204694.73068: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9940353410> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9940351550> <<< 43681 1727204694.73112: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 43681 1727204694.73126: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 43681 1727204694.73191: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9940356c00> <<< 43681 1727204694.73334: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99403535c0> <<< 43681 1727204694.73424: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9940357c20> <<< 43681 1727204694.73518: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9940357c80> <<< 43681 1727204694.73555: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9940357d70> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f994034b680> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 43681 1727204694.73576: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 43681 1727204694.73657: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 43681 1727204694.73674: stdout chunk (state=3): >>>import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f994035b680> <<< 43681 1727204694.73865: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 43681 1727204694.73937: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f994035c890> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9940359df0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f994035b170> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99403599d0> <<< 43681 1727204694.74021: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.74024: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # <<< 43681 1727204694.74027: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.74096: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.74204: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.74339: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 43681 1727204694.74360: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.74398: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.74538: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.75233: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.76055: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f99403e48f0> <<< 43681 1727204694.76151: 
stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 43681 1727204694.76177: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99403e5670> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9940358980> <<< 43681 1727204694.76234: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 43681 1727204694.76251: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.76299: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # <<< 43681 1727204694.76303: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.76466: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.76657: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 43681 1727204694.76682: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99403e5430> # zipimport: zlib available <<< 43681 1727204694.77248: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.77814: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.77884: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.77975: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 43681 1727204694.77994: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.78030: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.78070: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 43681 1727204694.78092: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.78164: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.78281: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 43681 1727204694.78313: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 43681 1727204694.78334: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.78379: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.78425: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 43681 1727204694.78436: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.78716: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.79007: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 43681 1727204694.79079: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 43681 1727204694.79098: stdout chunk (state=3): >>>import '_ast' # <<< 43681 1727204694.79181: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99403e6600> <<< 43681 1727204694.79195: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.79272: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.79368: stdout chunk (state=3): >>>import 
'ansible.module_utils.common.text.formatters' # <<< 43681 1727204694.79390: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 43681 1727204694.79420: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 43681 1727204694.79501: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 43681 1727204694.79613: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f99403ee330> <<< 43681 1727204694.79665: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 43681 1727204694.79672: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f99403eec30> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99403e74a0> <<< 43681 1727204694.79700: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.79740: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.79794: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # # zipimport: zlib available <<< 43681 1727204694.79844: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.79891: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.79952: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.80024: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 43681 1727204694.80065: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 43681 1727204694.80156: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 43681 1727204694.80165: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f99403edac0> <<< 43681 1727204694.80202: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99403eede0> <<< 43681 1727204694.80233: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 43681 1727204694.80248: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.80313: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.80383: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.80409: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.80458: stdout chunk (state=3): >>># 
/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 43681 1727204694.80484: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 43681 1727204694.80506: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 43681 1727204694.80534: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 43681 1727204694.80591: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 43681 1727204694.80611: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 43681 1727204694.80629: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 43681 1727204694.80691: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f994027eff0> <<< 43681 1727204694.80733: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99401fbe60> <<< 43681 1727204694.80828: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99401f6f60> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99401f6cc0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 43681 1727204694.80835: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.80863: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.80895: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 43681 1727204694.80949: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 43681 1727204694.80968: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.80991: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.80995: stdout chunk (state=3): >>>import 'ansible.modules' # <<< 43681 1727204694.81002: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.81143: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.81356: stdout chunk (state=3): >>># zipimport: zlib available <<< 43681 1727204694.81504: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ <<< 43681 1727204694.81833: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 43681 1727204694.81859: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing 
_imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal <<< 43681 1727204694.81906: stdout chunk (state=3): >>># cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site <<< 43681 1727204694.81927: stdout chunk (state=3): >>># cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref <<< 43681 1727204694.81966: stdout chunk (state=3): >>># cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] 
removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select <<< 43681 1727204694.82002: stdout chunk (state=3): >>># cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six <<< 43681 1727204694.82023: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing 
ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 43681 1727204694.82250: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 43681 1727204694.82309: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 43681 1727204694.82314: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress <<< 43681 1727204694.82346: stdout chunk (state=3): >>># destroy ntpath # destroy importlib <<< 43681 1727204694.82404: stdout chunk (state=3): >>># destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd <<< 43681 1727204694.82411: stdout chunk (state=3): >>># destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal <<< 43681 1727204694.82465: stdout chunk (state=3): >>># destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime <<< 43681 1727204694.82470: stdout chunk (state=3): >>># destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse <<< 43681 1727204694.82524: stdout chunk (state=3): >>># destroy json # destroy logging # destroy shlex # destroy subprocess <<< 43681 1727204694.82531: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux <<< 43681 1727204694.82588: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 <<< 43681 1727204694.82630: stdout chunk (state=3): >>># cleanup[3] wiping 
contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools <<< 43681 1727204694.82686: stdout chunk (state=3): >>># cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat <<< 43681 1727204694.82691: stdout chunk (state=3): >>># destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 43681 1727204694.82714: stdout chunk (state=3): >>># destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 43681 1727204694.82852: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 43681 1727204694.82862: stdout chunk (state=3): >>># destroy _collections <<< 43681 1727204694.82898: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 43681 1727204694.82931: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 43681 1727204694.82958: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 43681 1727204694.82972: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 43681 1727204694.83071: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases <<< 43681 1727204694.83077: stdout chunk (state=3): >>># destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 43681 1727204694.83132: stdout chunk (state=3): >>># destroy _random # destroy _weakref <<< 43681 
1727204694.83137: stdout chunk (state=3): >>># destroy _operator # destroy _sha2 # destroy _string # destroy re <<< 43681 1727204694.83166: stdout chunk (state=3): >>># destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 43681 1727204694.83622: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204694.83634: stderr chunk (state=3): >>>Shared connection to 10.31.10.90 closed. <<< 43681 1727204694.83797: stdout chunk (state=3): >>><<< 43681 1727204694.83800: stderr chunk (state=3): >>><<< 43681 1727204694.83815: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99408d44d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99408a3ad0> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99408d6a20> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99406c50a0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99406c5fd0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9940703e90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9940703f50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f994073b860> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f994073bef0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f994071bb60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9940719280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9940701040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f994075f740> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f994075e360> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f994071a270> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9940702f30> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9940790740> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99407002c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9940790bf0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9940790aa0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9940790e30> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99406fede0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9940791520> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99407911f0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9940792420> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99407ac650> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f99407add60> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f99407aec60> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f99407af2c0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99407ae1b0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f99407afd40> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99407af470> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9940792480> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f994052fcb0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f99405587a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9940558500> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9940558620> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9940558980> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f994052de50> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f994055a090> 
import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9940558d10> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9940792b70> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9940582450> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f994059e5a0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99405d72f0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99405fda90> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99405d7410> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f994059f230> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99404143b0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f994059d5e0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f994055aff0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f994059d700> # zipimport: found 30 names in '/tmp/ansible_stat_payload_xmlmpoce/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # 
/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f994046e030> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9940444f20> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9940417fb0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9940447ec0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9940499b20> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99404998b0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99404991c0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9940499610> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f994046ecc0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f994049a8d0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f994049ab10> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f994049aff0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from 
'/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99402f8e00> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f99402faa20> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99402fb380> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99402fc2c0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99402ff020> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f99402ff110> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99402fd2e0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9940302fc0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9940301a90> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99403017f0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9940303f80> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99402fd7f0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f994034b200> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f994034b350> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9940350f50> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9940350ce0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9940353410> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9940351550> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9940356c00> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99403535c0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9940357c20> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9940357c80> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 
'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9940357d70> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f994034b680> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f994035b680> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f994035c890> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9940359df0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f994035b170> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99403599d0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f99403e48f0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99403e5670> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9940358980> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available 
# zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99403e5430> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99403e6600> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f99403ee330> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f99403eec30> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99403e74a0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f99403edac0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99403eede0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f994027eff0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99401fbe60> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99401f6f60> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99401f6cc0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy 
reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # 
cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy 
ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # 
destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
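Editor's note: the [WARNING] record that follows flags exactly this situation: the module's stdout carried a valid JSON result followed by the interpreter-shutdown noise reproduced above. As a purely illustrative sketch (not Ansible's own parser), the standard library can separate the JSON payload from such trailing junk; the stdout literal below is a hypothetical, abridged stand-in for the real output.

import json

# Hypothetical, abridged stdout: a JSON result followed by shutdown noise.
stdout = '{"changed": false, "stat": {"exists": false}}\n# destroy __main__ # clear sys.path_importer_cache ...'

start = stdout.index("{")                     # locate the first JSON object
payload, end = json.JSONDecoder().raw_decode(stdout, start)
junk = stdout[end:].strip()                   # anything after the JSON is "junk"

print(payload["stat"]["exists"])              # False
print(bool(junk))                             # True -> this is what triggers the warning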
[WARNING]: Module invocation had junk after the JSON data:
43681 1727204694.84583: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204694.4133108-43880-122564553418301/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 43681 1727204694.84587: _low_level_execute_command(): starting 43681 1727204694.84594:
_low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204694.4133108-43880-122564553418301/ > /dev/null 2>&1 && sleep 0' 43681 1727204694.84978: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204694.84996: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204694.85013: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204694.85042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204694.85060: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 43681 1727204694.85133: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204694.85192: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204694.85208: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204694.85268: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204694.85297: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204694.87309: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204694.87312: stdout chunk (state=3): >>><<< 43681 1727204694.87318: stderr chunk (state=3): >>><<< 43681 1727204694.87399: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204694.87403: handler run complete 43681 1727204694.87405: attempt loop complete, returning result 43681 1727204694.87408: _execute() done 43681 1727204694.87411: dumping result to json 43681 1727204694.87414: done dumping result, returning 43681 1727204694.87428: done running TaskExecutor() for managed-node3/TASK: 
Check if system is ostree [12b410aa-8751-9e86-7728-0000000000c2] 43681 1727204694.87443: sending task result for task 12b410aa-8751-9e86-7728-0000000000c2 ok: [managed-node3] => { "changed": false, "stat": { "exists": false } } 43681 1727204694.87647: no more pending results, returning what we have 43681 1727204694.87650: results queue empty 43681 1727204694.87652: checking for any_errors_fatal 43681 1727204694.87659: done checking for any_errors_fatal 43681 1727204694.87660: checking for max_fail_percentage 43681 1727204694.87662: done checking for max_fail_percentage 43681 1727204694.87663: checking to see if all hosts have failed and the running result is not ok 43681 1727204694.87664: done checking to see if all hosts have failed 43681 1727204694.87665: getting the remaining hosts for this loop 43681 1727204694.87667: done getting the remaining hosts for this loop 43681 1727204694.87671: getting the next task for host managed-node3 43681 1727204694.87680: done getting next task for host managed-node3 43681 1727204694.87683: ^ task is: TASK: Set flag to indicate system is ostree 43681 1727204694.87686: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 43681 1727204694.87691: getting variables 43681 1727204694.87693: in VariableManager get_vars() 43681 1727204694.87729: Calling all_inventory to load vars for managed-node3 43681 1727204694.87732: Calling groups_inventory to load vars for managed-node3 43681 1727204694.87736: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204694.87748: Calling all_plugins_play to load vars for managed-node3 43681 1727204694.87753: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204694.87756: Calling groups_plugins_play to load vars for managed-node3 43681 1727204694.88380: done sending task result for task 12b410aa-8751-9e86-7728-0000000000c2 43681 1727204694.88384: WORKER PROCESS EXITING 43681 1727204694.88413: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204694.88801: done with get_vars() 43681 1727204694.88814: done getting variables 43681 1727204694.88931: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Tuesday 24 September 2024 15:04:54 -0400 (0:00:00.522) 0:00:02.556 ***** 43681 1727204694.88963: entering _queue_task() for managed-node3/set_fact 43681 1727204694.88965: Creating lock for set_fact 43681 1727204694.89268: worker is 1 (out of 1 available) 43681 1727204694.89280: exiting _queue_task() for managed-node3/set_fact 43681 1727204694.89295: done queuing 
things up, now waiting for results queue to drain 43681 1727204694.89297: waiting for pending results... 43681 1727204694.89572: running TaskExecutor() for managed-node3/TASK: Set flag to indicate system is ostree 43681 1727204694.89704: in run() - task 12b410aa-8751-9e86-7728-0000000000c3 43681 1727204694.89725: variable 'ansible_search_path' from source: unknown 43681 1727204694.89732: variable 'ansible_search_path' from source: unknown 43681 1727204694.89783: calling self._execute() 43681 1727204694.89879: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204694.89896: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204694.89911: variable 'omit' from source: magic vars 43681 1727204694.90510: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 43681 1727204694.90848: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 43681 1727204694.90910: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 43681 1727204694.90968: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 43681 1727204694.91016: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 43681 1727204694.91127: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 43681 1727204694.91172: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 43681 1727204694.91212: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204694.91255: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 43681 1727204694.91491: Evaluated conditional (not __network_is_ostree is defined): True 43681 1727204694.91500: variable 'omit' from source: magic vars 43681 1727204694.91503: variable 'omit' from source: magic vars 43681 1727204694.91629: variable '__ostree_booted_stat' from source: set_fact 43681 1727204694.91684: variable 'omit' from source: magic vars 43681 1727204694.91724: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204694.91758: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204694.91782: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204694.91815: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204694.91838: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204694.91880: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204694.91891: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204694.91901: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node3' 43681 1727204694.92039: Set connection var ansible_shell_type to sh 43681 1727204694.92056: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204694.92067: Set connection var ansible_timeout to 10 43681 1727204694.92084: Set connection var ansible_pipelining to False 43681 1727204694.92098: Set connection var ansible_connection to ssh 43681 1727204694.92144: Set connection var ansible_shell_executable to /bin/sh 43681 1727204694.92147: variable 'ansible_shell_executable' from source: unknown 43681 1727204694.92157: variable 'ansible_connection' from source: unknown 43681 1727204694.92165: variable 'ansible_module_compression' from source: unknown 43681 1727204694.92172: variable 'ansible_shell_type' from source: unknown 43681 1727204694.92180: variable 'ansible_shell_executable' from source: unknown 43681 1727204694.92187: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204694.92197: variable 'ansible_pipelining' from source: unknown 43681 1727204694.92205: variable 'ansible_timeout' from source: unknown 43681 1727204694.92253: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204694.92399: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 43681 1727204694.92418: variable 'omit' from source: magic vars 43681 1727204694.92430: starting attempt loop 43681 1727204694.92437: running the handler 43681 1727204694.92455: handler run complete 43681 1727204694.92482: attempt loop complete, returning result 43681 1727204694.92583: _execute() done 43681 1727204694.92588: dumping result to json 43681 1727204694.92590: done dumping result, returning 43681 1727204694.92593: done running TaskExecutor() for managed-node3/TASK: Set flag to indicate system is ostree [12b410aa-8751-9e86-7728-0000000000c3] 43681 1727204694.92597: sending task result for task 12b410aa-8751-9e86-7728-0000000000c3 43681 1727204694.92672: done sending task result for task 12b410aa-8751-9e86-7728-0000000000c3 43681 1727204694.92676: WORKER PROCESS EXITING ok: [managed-node3] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 43681 1727204694.92751: no more pending results, returning what we have 43681 1727204694.92755: results queue empty 43681 1727204694.92757: checking for any_errors_fatal 43681 1727204694.92767: done checking for any_errors_fatal 43681 1727204694.92768: checking for max_fail_percentage 43681 1727204694.92771: done checking for max_fail_percentage 43681 1727204694.92772: checking to see if all hosts have failed and the running result is not ok 43681 1727204694.92773: done checking to see if all hosts have failed 43681 1727204694.92774: getting the remaining hosts for this loop 43681 1727204694.92776: done getting the remaining hosts for this loop 43681 1727204694.92782: getting the next task for host managed-node3 43681 1727204694.92795: done getting next task for host managed-node3 43681 1727204694.92798: ^ task is: TASK: Fix CentOS6 Base repo 43681 1727204694.92802: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 43681 1727204694.92806: getting variables 43681 1727204694.92809: in VariableManager get_vars() 43681 1727204694.92843: Calling all_inventory to load vars for managed-node3 43681 1727204694.92847: Calling groups_inventory to load vars for managed-node3 43681 1727204694.92852: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204694.92866: Calling all_plugins_play to load vars for managed-node3 43681 1727204694.92870: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204694.92882: Calling groups_plugins_play to load vars for managed-node3 43681 1727204694.93368: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204694.93663: done with get_vars() 43681 1727204694.93675: done getting variables 43681 1727204694.93826: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Tuesday 24 September 2024 15:04:54 -0400 (0:00:00.048) 0:00:02.605 ***** 43681 1727204694.93859: entering _queue_task() for managed-node3/copy 43681 1727204694.94156: worker is 1 (out of 1 available) 43681 1727204694.94168: exiting _queue_task() for managed-node3/copy 43681 1727204694.94181: done queuing things up, now waiting for results queue to drain 43681 1727204694.94183: waiting for pending results... 
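Editor's note: the ok result for 'Set flag to indicate system is ostree' above derives __network_is_ostree from the earlier stat of /run/ostree-booted (registered as __ostree_booted_stat). A rough stand-alone sketch of that derivation; the dictionary below is a stand-in mirroring the stat result logged above, and the one-line expression is an assumption about what the set_fact task templates, not a copy of it.

# Stand-in for the registered result of the earlier stat task on /run/ostree-booted.
__ostree_booted_stat = {"changed": False, "stat": {"exists": False}}

# Presumed shape of the fact: "is the ostree marker file present on the node?"
__network_is_ostree = __ostree_booted_stat["stat"]["exists"]

print({"__network_is_ostree": __network_is_ostree})   # {'__network_is_ostree': False}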
43681 1727204694.94548: running TaskExecutor() for managed-node3/TASK: Fix CentOS6 Base repo 43681 1727204694.94595: in run() - task 12b410aa-8751-9e86-7728-0000000000c5 43681 1727204694.94599: variable 'ansible_search_path' from source: unknown 43681 1727204694.94602: variable 'ansible_search_path' from source: unknown 43681 1727204694.94752: calling self._execute() 43681 1727204694.94756: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204694.94759: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204694.94762: variable 'omit' from source: magic vars 43681 1727204694.95381: variable 'ansible_distribution' from source: facts 43681 1727204694.95421: Evaluated conditional (ansible_distribution == 'CentOS'): False 43681 1727204694.95430: when evaluation is False, skipping this task 43681 1727204694.95438: _execute() done 43681 1727204694.95445: dumping result to json 43681 1727204694.95453: done dumping result, returning 43681 1727204694.95464: done running TaskExecutor() for managed-node3/TASK: Fix CentOS6 Base repo [12b410aa-8751-9e86-7728-0000000000c5] 43681 1727204694.95476: sending task result for task 12b410aa-8751-9e86-7728-0000000000c5 skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution == 'CentOS'", "skip_reason": "Conditional result was False" } 43681 1727204694.95764: no more pending results, returning what we have 43681 1727204694.95769: results queue empty 43681 1727204694.95770: checking for any_errors_fatal 43681 1727204694.95775: done checking for any_errors_fatal 43681 1727204694.95776: checking for max_fail_percentage 43681 1727204694.95778: done checking for max_fail_percentage 43681 1727204694.95779: checking to see if all hosts have failed and the running result is not ok 43681 1727204694.95780: done checking to see if all hosts have failed 43681 1727204694.95781: getting the remaining hosts for this loop 43681 1727204694.95782: done getting the remaining hosts for this loop 43681 1727204694.95787: getting the next task for host managed-node3 43681 1727204694.95796: done getting next task for host managed-node3 43681 1727204694.95806: ^ task is: TASK: Include the task 'enable_epel.yml' 43681 1727204694.95810: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204694.95815: getting variables 43681 1727204694.95817: in VariableManager get_vars() 43681 1727204694.95847: Calling all_inventory to load vars for managed-node3 43681 1727204694.95851: Calling groups_inventory to load vars for managed-node3 43681 1727204694.95855: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204694.95870: Calling all_plugins_play to load vars for managed-node3 43681 1727204694.95875: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204694.95879: Calling groups_plugins_play to load vars for managed-node3 43681 1727204694.95894: done sending task result for task 12b410aa-8751-9e86-7728-0000000000c5 43681 1727204694.95898: WORKER PROCESS EXITING 43681 1727204694.96386: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204694.96732: done with get_vars() 43681 1727204694.96745: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Tuesday 24 September 2024 15:04:54 -0400 (0:00:00.030) 0:00:02.635 ***** 43681 1727204694.96867: entering _queue_task() for managed-node3/include_tasks 43681 1727204694.97406: worker is 1 (out of 1 available) 43681 1727204694.97414: exiting _queue_task() for managed-node3/include_tasks 43681 1727204694.97425: done queuing things up, now waiting for results queue to drain 43681 1727204694.97427: waiting for pending results... 43681 1727204694.97458: running TaskExecutor() for managed-node3/TASK: Include the task 'enable_epel.yml' 43681 1727204694.97660: in run() - task 12b410aa-8751-9e86-7728-0000000000c6 43681 1727204694.97665: variable 'ansible_search_path' from source: unknown 43681 1727204694.97667: variable 'ansible_search_path' from source: unknown 43681 1727204694.97670: calling self._execute() 43681 1727204694.97683: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204694.97694: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204694.97707: variable 'omit' from source: magic vars 43681 1727204694.98292: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 43681 1727204695.01200: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 43681 1727204695.01204: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 43681 1727204695.01208: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 43681 1727204695.01237: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 43681 1727204695.01265: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 43681 1727204695.01360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204695.01396: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204695.01430: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204695.01480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204695.01498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204695.01634: variable '__network_is_ostree' from source: set_fact 43681 1727204695.01650: Evaluated conditional (not __network_is_ostree | d(false)): True 43681 1727204695.01659: _execute() done 43681 1727204695.01662: dumping result to json 43681 1727204695.01664: done dumping result, returning 43681 1727204695.01673: done running TaskExecutor() for managed-node3/TASK: Include the task 'enable_epel.yml' [12b410aa-8751-9e86-7728-0000000000c6] 43681 1727204695.01679: sending task result for task 12b410aa-8751-9e86-7728-0000000000c6 43681 1727204695.01878: no more pending results, returning what we have 43681 1727204695.01884: in VariableManager get_vars() 43681 1727204695.01918: Calling all_inventory to load vars for managed-node3 43681 1727204695.01922: Calling groups_inventory to load vars for managed-node3 43681 1727204695.01926: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204695.01936: Calling all_plugins_play to load vars for managed-node3 43681 1727204695.01939: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204695.01943: Calling groups_plugins_play to load vars for managed-node3 43681 1727204695.02179: done sending task result for task 12b410aa-8751-9e86-7728-0000000000c6 43681 1727204695.02182: WORKER PROCESS EXITING 43681 1727204695.02210: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204695.02572: done with get_vars() 43681 1727204695.02583: variable 'ansible_search_path' from source: unknown 43681 1727204695.02584: variable 'ansible_search_path' from source: unknown 43681 1727204695.02629: we have included files to process 43681 1727204695.02631: generating all_blocks data 43681 1727204695.02633: done generating all_blocks data 43681 1727204695.02638: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 43681 1727204695.02640: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 43681 1727204695.02643: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 43681 1727204695.03534: done processing included file 43681 1727204695.03542: iterating over new_blocks loaded from include file 43681 1727204695.03544: in VariableManager get_vars() 43681 1727204695.03560: done with get_vars() 43681 1727204695.03562: filtering new block on tags 43681 1727204695.03592: done filtering new block on tags 43681 1727204695.03596: in VariableManager get_vars() 43681 1727204695.03609: done with get_vars() 43681 1727204695.03611: filtering new block on tags 43681 1727204695.03625: done filtering new block on tags 43681 1727204695.03628: done iterating over new_blocks loaded from include file included: 
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed-node3 43681 1727204695.03635: extending task lists for all hosts with included blocks 43681 1727204695.03763: done extending task lists 43681 1727204695.03764: done processing included files 43681 1727204695.03765: results queue empty 43681 1727204695.03766: checking for any_errors_fatal 43681 1727204695.03770: done checking for any_errors_fatal 43681 1727204695.03771: checking for max_fail_percentage 43681 1727204695.03772: done checking for max_fail_percentage 43681 1727204695.03773: checking to see if all hosts have failed and the running result is not ok 43681 1727204695.03774: done checking to see if all hosts have failed 43681 1727204695.03775: getting the remaining hosts for this loop 43681 1727204695.03776: done getting the remaining hosts for this loop 43681 1727204695.03779: getting the next task for host managed-node3 43681 1727204695.03783: done getting next task for host managed-node3 43681 1727204695.03785: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 43681 1727204695.03788: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204695.03792: getting variables 43681 1727204695.03793: in VariableManager get_vars() 43681 1727204695.03802: Calling all_inventory to load vars for managed-node3 43681 1727204695.03804: Calling groups_inventory to load vars for managed-node3 43681 1727204695.03807: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204695.03813: Calling all_plugins_play to load vars for managed-node3 43681 1727204695.03821: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204695.03824: Calling groups_plugins_play to load vars for managed-node3 43681 1727204695.04061: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204695.04417: done with get_vars() 43681 1727204695.04428: done getting variables 43681 1727204695.04536: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 43681 1727204695.04819: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 39] ********************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Tuesday 24 September 2024 15:04:55 -0400 (0:00:00.080) 0:00:02.715 ***** 43681 1727204695.04880: entering _queue_task() for managed-node3/command 43681 1727204695.04882: Creating lock for command 43681 1727204695.05243: worker is 1 (out of 1 available) 43681 1727204695.05259: exiting _queue_task() for managed-node3/command 43681 1727204695.05273: done queuing things up, now waiting for results queue to drain 43681 1727204695.05275: waiting for pending results... 
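Editor's note: each 'Evaluated conditional (...)' record in this part of the run is a when: expression rendered against the host's variables. The sketch below replays three of them with plain Jinja2; the fact values are stand-ins chosen to be consistent with the outcomes in the log (the managed node is evidently neither RedHat nor CentOS), and this is not Ansible's own conditional plumbing.

from jinja2 import Environment

env = Environment()
env.filters.setdefault("d", env.filters["default"])   # 'd' is the short alias for 'default'

# Stand-in facts consistent with the results logged in this run.
facts = {"ansible_distribution": "Fedora", "__network_is_ostree": False}

for expr in ("ansible_distribution == 'CentOS'",
             "not __network_is_ostree | d(false)",
             "ansible_distribution in ['RedHat', 'CentOS']"):
    result = env.compile_expression(expr)(**facts)
    print(f"Evaluated conditional ({expr}): {bool(result)}")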
43681 1727204695.05574: running TaskExecutor() for managed-node3/TASK: Create EPEL 39 43681 1727204695.05733: in run() - task 12b410aa-8751-9e86-7728-0000000000e0 43681 1727204695.05762: variable 'ansible_search_path' from source: unknown 43681 1727204695.05771: variable 'ansible_search_path' from source: unknown 43681 1727204695.05823: calling self._execute() 43681 1727204695.05941: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204695.06045: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204695.06049: variable 'omit' from source: magic vars 43681 1727204695.06492: variable 'ansible_distribution' from source: facts 43681 1727204695.06512: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 43681 1727204695.06522: when evaluation is False, skipping this task 43681 1727204695.06571: _execute() done 43681 1727204695.06575: dumping result to json 43681 1727204695.06578: done dumping result, returning 43681 1727204695.06580: done running TaskExecutor() for managed-node3/TASK: Create EPEL 39 [12b410aa-8751-9e86-7728-0000000000e0] 43681 1727204695.06595: sending task result for task 12b410aa-8751-9e86-7728-0000000000e0 43681 1727204695.06931: done sending task result for task 12b410aa-8751-9e86-7728-0000000000e0 43681 1727204695.06935: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 43681 1727204695.07028: no more pending results, returning what we have 43681 1727204695.07032: results queue empty 43681 1727204695.07033: checking for any_errors_fatal 43681 1727204695.07035: done checking for any_errors_fatal 43681 1727204695.07036: checking for max_fail_percentage 43681 1727204695.07038: done checking for max_fail_percentage 43681 1727204695.07038: checking to see if all hosts have failed and the running result is not ok 43681 1727204695.07040: done checking to see if all hosts have failed 43681 1727204695.07041: getting the remaining hosts for this loop 43681 1727204695.07042: done getting the remaining hosts for this loop 43681 1727204695.07048: getting the next task for host managed-node3 43681 1727204695.07054: done getting next task for host managed-node3 43681 1727204695.07057: ^ task is: TASK: Install yum-utils package 43681 1727204695.07062: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204695.07066: getting variables 43681 1727204695.07067: in VariableManager get_vars() 43681 1727204695.07097: Calling all_inventory to load vars for managed-node3 43681 1727204695.07100: Calling groups_inventory to load vars for managed-node3 43681 1727204695.07104: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204695.07115: Calling all_plugins_play to load vars for managed-node3 43681 1727204695.07119: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204695.07123: Calling groups_plugins_play to load vars for managed-node3 43681 1727204695.07580: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204695.08149: done with get_vars() 43681 1727204695.08165: done getting variables 43681 1727204695.08286: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Tuesday 24 September 2024 15:04:55 -0400 (0:00:00.034) 0:00:02.749 ***** 43681 1727204695.08321: entering _queue_task() for managed-node3/package 43681 1727204695.08323: Creating lock for package 43681 1727204695.08630: worker is 1 (out of 1 available) 43681 1727204695.08644: exiting _queue_task() for managed-node3/package 43681 1727204695.08657: done queuing things up, now waiting for results queue to drain 43681 1727204695.08658: waiting for pending results... 
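Editor's note: a few records above, the task name 'Create EPEL {{ ansible_distribution_major_version }}' becomes the banner 'TASK [Create EPEL 39]' once ansible_distribution_major_version is read from the facts. The same templating step in isolation with plain Jinja2; the value '39' is taken from the rendered banner, not from any data outside this log.

from jinja2 import Environment

env = Environment()
name = env.from_string("Create EPEL {{ ansible_distribution_major_version }}")
print(name.render(ansible_distribution_major_version="39"))   # Create EPEL 39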
43681 1727204695.09049: running TaskExecutor() for managed-node3/TASK: Install yum-utils package 43681 1727204695.09082: in run() - task 12b410aa-8751-9e86-7728-0000000000e1 43681 1727204695.09106: variable 'ansible_search_path' from source: unknown 43681 1727204695.09114: variable 'ansible_search_path' from source: unknown 43681 1727204695.09165: calling self._execute() 43681 1727204695.09261: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204695.09275: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204695.09292: variable 'omit' from source: magic vars 43681 1727204695.09760: variable 'ansible_distribution' from source: facts 43681 1727204695.09779: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 43681 1727204695.09787: when evaluation is False, skipping this task 43681 1727204695.09803: _execute() done 43681 1727204695.09811: dumping result to json 43681 1727204695.09819: done dumping result, returning 43681 1727204695.09831: done running TaskExecutor() for managed-node3/TASK: Install yum-utils package [12b410aa-8751-9e86-7728-0000000000e1] 43681 1727204695.09841: sending task result for task 12b410aa-8751-9e86-7728-0000000000e1 skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 43681 1727204695.10125: no more pending results, returning what we have 43681 1727204695.10130: results queue empty 43681 1727204695.10131: checking for any_errors_fatal 43681 1727204695.10141: done checking for any_errors_fatal 43681 1727204695.10142: checking for max_fail_percentage 43681 1727204695.10144: done checking for max_fail_percentage 43681 1727204695.10145: checking to see if all hosts have failed and the running result is not ok 43681 1727204695.10146: done checking to see if all hosts have failed 43681 1727204695.10147: getting the remaining hosts for this loop 43681 1727204695.10149: done getting the remaining hosts for this loop 43681 1727204695.10154: getting the next task for host managed-node3 43681 1727204695.10161: done getting next task for host managed-node3 43681 1727204695.10164: ^ task is: TASK: Enable EPEL 7 43681 1727204695.10168: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204695.10173: getting variables 43681 1727204695.10175: in VariableManager get_vars() 43681 1727204695.10210: Calling all_inventory to load vars for managed-node3 43681 1727204695.10213: Calling groups_inventory to load vars for managed-node3 43681 1727204695.10217: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204695.10233: Calling all_plugins_play to load vars for managed-node3 43681 1727204695.10238: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204695.10243: Calling groups_plugins_play to load vars for managed-node3 43681 1727204695.10633: done sending task result for task 12b410aa-8751-9e86-7728-0000000000e1 43681 1727204695.10636: WORKER PROCESS EXITING 43681 1727204695.10662: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204695.11004: done with get_vars() 43681 1727204695.11016: done getting variables 43681 1727204695.11085: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Tuesday 24 September 2024 15:04:55 -0400 (0:00:00.027) 0:00:02.777 ***** 43681 1727204695.11122: entering _queue_task() for managed-node3/command 43681 1727204695.11506: worker is 1 (out of 1 available) 43681 1727204695.11518: exiting _queue_task() for managed-node3/command 43681 1727204695.11529: done queuing things up, now waiting for results queue to drain 43681 1727204695.11531: waiting for pending results... 
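Editor's note: the timing line under each TASK banner, e.g. 'Tuesday 24 September 2024 15:04:55 -0400 (0:00:00.027) 0:00:02.777 *****', carries two durations: the parenthesised value is the time spent on the previous task and the second value is the running total for the play (0:00:02.749 + 0.027 s gives the 0:00:02.777 above, allowing for rounding). A small sketch of pulling both values out of such a line:

import re
from datetime import timedelta

line = "Tuesday 24 September 2024 15:04:55 -0400 (0:00:00.027) 0:00:02.777 *****"

def to_delta(text):
    # "0:00:02.777" -> timedelta(seconds=2.777)
    h, m, s = text.split(":")
    return timedelta(hours=int(h), minutes=int(m), seconds=float(s))

match = re.search(r"\((\d+:\d+:\d+\.\d+)\)\s+(\d+:\d+:\d+\.\d+)", line)
previous_task, running_total = (to_delta(t) for t in match.groups())
print(previous_task.total_seconds(), running_total.total_seconds())   # 0.027 2.777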
43681 1727204695.11702: running TaskExecutor() for managed-node3/TASK: Enable EPEL 7 43681 1727204695.11846: in run() - task 12b410aa-8751-9e86-7728-0000000000e2 43681 1727204695.11871: variable 'ansible_search_path' from source: unknown 43681 1727204695.11879: variable 'ansible_search_path' from source: unknown 43681 1727204695.11927: calling self._execute() 43681 1727204695.12021: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204695.12036: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204695.12050: variable 'omit' from source: magic vars 43681 1727204695.12521: variable 'ansible_distribution' from source: facts 43681 1727204695.12567: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 43681 1727204695.12571: when evaluation is False, skipping this task 43681 1727204695.12574: _execute() done 43681 1727204695.12576: dumping result to json 43681 1727204695.12578: done dumping result, returning 43681 1727204695.12582: done running TaskExecutor() for managed-node3/TASK: Enable EPEL 7 [12b410aa-8751-9e86-7728-0000000000e2] 43681 1727204695.12676: sending task result for task 12b410aa-8751-9e86-7728-0000000000e2 43681 1727204695.12746: done sending task result for task 12b410aa-8751-9e86-7728-0000000000e2 43681 1727204695.12750: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 43681 1727204695.12808: no more pending results, returning what we have 43681 1727204695.12812: results queue empty 43681 1727204695.12813: checking for any_errors_fatal 43681 1727204695.12822: done checking for any_errors_fatal 43681 1727204695.12823: checking for max_fail_percentage 43681 1727204695.12826: done checking for max_fail_percentage 43681 1727204695.12826: checking to see if all hosts have failed and the running result is not ok 43681 1727204695.12828: done checking to see if all hosts have failed 43681 1727204695.12828: getting the remaining hosts for this loop 43681 1727204695.12830: done getting the remaining hosts for this loop 43681 1727204695.12836: getting the next task for host managed-node3 43681 1727204695.12843: done getting next task for host managed-node3 43681 1727204695.12846: ^ task is: TASK: Enable EPEL 8 43681 1727204695.12850: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204695.12855: getting variables 43681 1727204695.12857: in VariableManager get_vars() 43681 1727204695.12996: Calling all_inventory to load vars for managed-node3 43681 1727204695.13004: Calling groups_inventory to load vars for managed-node3 43681 1727204695.13008: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204695.13020: Calling all_plugins_play to load vars for managed-node3 43681 1727204695.13024: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204695.13028: Calling groups_plugins_play to load vars for managed-node3 43681 1727204695.13409: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204695.13712: done with get_vars() 43681 1727204695.13722: done getting variables 43681 1727204695.13786: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Tuesday 24 September 2024 15:04:55 -0400 (0:00:00.026) 0:00:02.804 ***** 43681 1727204695.13817: entering _queue_task() for managed-node3/command 43681 1727204695.14218: worker is 1 (out of 1 available) 43681 1727204695.14232: exiting _queue_task() for managed-node3/command 43681 1727204695.14246: done queuing things up, now waiting for results queue to drain 43681 1727204695.14248: waiting for pending results... 43681 1727204695.14550: running TaskExecutor() for managed-node3/TASK: Enable EPEL 8 43681 1727204695.14754: in run() - task 12b410aa-8751-9e86-7728-0000000000e3 43681 1727204695.14758: variable 'ansible_search_path' from source: unknown 43681 1727204695.14761: variable 'ansible_search_path' from source: unknown 43681 1727204695.14764: calling self._execute() 43681 1727204695.14843: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204695.14867: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204695.14884: variable 'omit' from source: magic vars 43681 1727204695.15379: variable 'ansible_distribution' from source: facts 43681 1727204695.15417: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 43681 1727204695.15427: when evaluation is False, skipping this task 43681 1727204695.15436: _execute() done 43681 1727204695.15444: dumping result to json 43681 1727204695.15516: done dumping result, returning 43681 1727204695.15522: done running TaskExecutor() for managed-node3/TASK: Enable EPEL 8 [12b410aa-8751-9e86-7728-0000000000e3] 43681 1727204695.15528: sending task result for task 12b410aa-8751-9e86-7728-0000000000e3 43681 1727204695.15606: done sending task result for task 12b410aa-8751-9e86-7728-0000000000e3 43681 1727204695.15609: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 43681 1727204695.15686: no more pending results, returning what we have 43681 1727204695.15693: results queue empty 43681 1727204695.15694: checking for any_errors_fatal 43681 1727204695.15700: done checking for any_errors_fatal 43681 1727204695.15701: checking 
for max_fail_percentage 43681 1727204695.15704: done checking for max_fail_percentage 43681 1727204695.15704: checking to see if all hosts have failed and the running result is not ok 43681 1727204695.15706: done checking to see if all hosts have failed 43681 1727204695.15707: getting the remaining hosts for this loop 43681 1727204695.15708: done getting the remaining hosts for this loop 43681 1727204695.15714: getting the next task for host managed-node3 43681 1727204695.15724: done getting next task for host managed-node3 43681 1727204695.15733: ^ task is: TASK: Enable EPEL 6 43681 1727204695.15738: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 43681 1727204695.15743: getting variables 43681 1727204695.15745: in VariableManager get_vars() 43681 1727204695.15779: Calling all_inventory to load vars for managed-node3 43681 1727204695.15782: Calling groups_inventory to load vars for managed-node3 43681 1727204695.15787: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204695.15855: Calling all_plugins_play to load vars for managed-node3 43681 1727204695.15861: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204695.15866: Calling groups_plugins_play to load vars for managed-node3 43681 1727204695.16313: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204695.16629: done with get_vars() 43681 1727204695.16641: done getting variables 43681 1727204695.16713: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Tuesday 24 September 2024 15:04:55 -0400 (0:00:00.029) 0:00:02.834 ***** 43681 1727204695.16749: entering _queue_task() for managed-node3/copy 43681 1727204695.17199: worker is 1 (out of 1 available) 43681 1727204695.17208: exiting _queue_task() for managed-node3/copy 43681 1727204695.17219: done queuing things up, now waiting for results queue to drain 43681 1727204695.17221: waiting for pending results... 
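The same guard repeats for the copy-based 'Enable EPEL 6' task queued above: the action plugin changes from command to copy, but the when: clause is identical, so on Fedora it is skipped in exactly the same way. A sketch under the same assumptions (the src and dest values are placeholders, not taken from this log):

  # Hypothetical reconstruction: task name and when: clause from the log; src/dest are placeholders.
  - name: Enable EPEL 6
    ansible.builtin.copy:
      src: epel.repo                     # placeholder
      dest: /etc/yum.repos.d/epel.repo   # placeholder
    when: ansible_distribution in ['RedHat', 'CentOS']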
43681 1727204695.17349: running TaskExecutor() for managed-node3/TASK: Enable EPEL 6 43681 1727204695.17446: in run() - task 12b410aa-8751-9e86-7728-0000000000e5 43681 1727204695.17450: variable 'ansible_search_path' from source: unknown 43681 1727204695.17453: variable 'ansible_search_path' from source: unknown 43681 1727204695.17497: calling self._execute() 43681 1727204695.17586: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204695.17664: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204695.17667: variable 'omit' from source: magic vars 43681 1727204695.18138: variable 'ansible_distribution' from source: facts 43681 1727204695.18157: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 43681 1727204695.18164: when evaluation is False, skipping this task 43681 1727204695.18172: _execute() done 43681 1727204695.18180: dumping result to json 43681 1727204695.18188: done dumping result, returning 43681 1727204695.18207: done running TaskExecutor() for managed-node3/TASK: Enable EPEL 6 [12b410aa-8751-9e86-7728-0000000000e5] 43681 1727204695.18218: sending task result for task 12b410aa-8751-9e86-7728-0000000000e5 43681 1727204695.18514: done sending task result for task 12b410aa-8751-9e86-7728-0000000000e5 43681 1727204695.18518: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 43681 1727204695.18559: no more pending results, returning what we have 43681 1727204695.18562: results queue empty 43681 1727204695.18563: checking for any_errors_fatal 43681 1727204695.18569: done checking for any_errors_fatal 43681 1727204695.18570: checking for max_fail_percentage 43681 1727204695.18572: done checking for max_fail_percentage 43681 1727204695.18573: checking to see if all hosts have failed and the running result is not ok 43681 1727204695.18574: done checking to see if all hosts have failed 43681 1727204695.18575: getting the remaining hosts for this loop 43681 1727204695.18577: done getting the remaining hosts for this loop 43681 1727204695.18580: getting the next task for host managed-node3 43681 1727204695.18591: done getting next task for host managed-node3 43681 1727204695.18594: ^ task is: TASK: Set network provider to 'nm' 43681 1727204695.18597: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204695.18601: getting variables 43681 1727204695.18603: in VariableManager get_vars() 43681 1727204695.18630: Calling all_inventory to load vars for managed-node3 43681 1727204695.18633: Calling groups_inventory to load vars for managed-node3 43681 1727204695.18637: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204695.18648: Calling all_plugins_play to load vars for managed-node3 43681 1727204695.18652: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204695.18656: Calling groups_plugins_play to load vars for managed-node3 43681 1727204695.19013: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204695.19330: done with get_vars() 43681 1727204695.19342: done getting variables 43681 1727204695.19417: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_routing_rules_nm.yml:13 Tuesday 24 September 2024 15:04:55 -0400 (0:00:00.026) 0:00:02.861 ***** 43681 1727204695.19448: entering _queue_task() for managed-node3/set_fact 43681 1727204695.19729: worker is 1 (out of 1 available) 43681 1727204695.19745: exiting _queue_task() for managed-node3/set_fact 43681 1727204695.19760: done queuing things up, now waiting for results queue to drain 43681 1727204695.19762: waiting for pending results... 43681 1727204695.20034: running TaskExecutor() for managed-node3/TASK: Set network provider to 'nm' 43681 1727204695.20148: in run() - task 12b410aa-8751-9e86-7728-000000000007 43681 1727204695.20171: variable 'ansible_search_path' from source: unknown 43681 1727204695.20223: calling self._execute() 43681 1727204695.20309: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204695.20328: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204695.20345: variable 'omit' from source: magic vars 43681 1727204695.20594: variable 'omit' from source: magic vars 43681 1727204695.20598: variable 'omit' from source: magic vars 43681 1727204695.20601: variable 'omit' from source: magic vars 43681 1727204695.20611: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204695.20659: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204695.20686: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204695.20719: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204695.20738: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204695.20778: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204695.20788: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204695.20799: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204695.20928: Set connection var ansible_shell_type 
to sh 43681 1727204695.20945: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204695.20957: Set connection var ansible_timeout to 10 43681 1727204695.20971: Set connection var ansible_pipelining to False 43681 1727204695.20982: Set connection var ansible_connection to ssh 43681 1727204695.20995: Set connection var ansible_shell_executable to /bin/sh 43681 1727204695.21027: variable 'ansible_shell_executable' from source: unknown 43681 1727204695.21035: variable 'ansible_connection' from source: unknown 43681 1727204695.21047: variable 'ansible_module_compression' from source: unknown 43681 1727204695.21055: variable 'ansible_shell_type' from source: unknown 43681 1727204695.21154: variable 'ansible_shell_executable' from source: unknown 43681 1727204695.21157: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204695.21160: variable 'ansible_pipelining' from source: unknown 43681 1727204695.21162: variable 'ansible_timeout' from source: unknown 43681 1727204695.21164: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204695.21268: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 43681 1727204695.21287: variable 'omit' from source: magic vars 43681 1727204695.21301: starting attempt loop 43681 1727204695.21309: running the handler 43681 1727204695.21327: handler run complete 43681 1727204695.21342: attempt loop complete, returning result 43681 1727204695.21350: _execute() done 43681 1727204695.21358: dumping result to json 43681 1727204695.21369: done dumping result, returning 43681 1727204695.21380: done running TaskExecutor() for managed-node3/TASK: Set network provider to 'nm' [12b410aa-8751-9e86-7728-000000000007] 43681 1727204695.21392: sending task result for task 12b410aa-8751-9e86-7728-000000000007 43681 1727204695.21717: done sending task result for task 12b410aa-8751-9e86-7728-000000000007 43681 1727204695.21721: WORKER PROCESS EXITING ok: [managed-node3] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 43681 1727204695.21768: no more pending results, returning what we have 43681 1727204695.21771: results queue empty 43681 1727204695.21772: checking for any_errors_fatal 43681 1727204695.21778: done checking for any_errors_fatal 43681 1727204695.21779: checking for max_fail_percentage 43681 1727204695.21780: done checking for max_fail_percentage 43681 1727204695.21781: checking to see if all hosts have failed and the running result is not ok 43681 1727204695.21782: done checking to see if all hosts have failed 43681 1727204695.21783: getting the remaining hosts for this loop 43681 1727204695.21784: done getting the remaining hosts for this loop 43681 1727204695.21788: getting the next task for host managed-node3 43681 1727204695.21797: done getting next task for host managed-node3 43681 1727204695.21799: ^ task is: TASK: meta (flush_handlers) 43681 1727204695.21802: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204695.21806: getting variables 43681 1727204695.21808: in VariableManager get_vars() 43681 1727204695.21836: Calling all_inventory to load vars for managed-node3 43681 1727204695.21839: Calling groups_inventory to load vars for managed-node3 43681 1727204695.21843: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204695.21853: Calling all_plugins_play to load vars for managed-node3 43681 1727204695.21857: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204695.21861: Calling groups_plugins_play to load vars for managed-node3 43681 1727204695.22149: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204695.22499: done with get_vars() 43681 1727204695.22510: done getting variables 43681 1727204695.22585: in VariableManager get_vars() 43681 1727204695.22598: Calling all_inventory to load vars for managed-node3 43681 1727204695.22601: Calling groups_inventory to load vars for managed-node3 43681 1727204695.22604: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204695.22610: Calling all_plugins_play to load vars for managed-node3 43681 1727204695.22613: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204695.22617: Calling groups_plugins_play to load vars for managed-node3 43681 1727204695.22834: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204695.23145: done with get_vars() 43681 1727204695.23162: done queuing things up, now waiting for results queue to drain 43681 1727204695.23164: results queue empty 43681 1727204695.23165: checking for any_errors_fatal 43681 1727204695.23168: done checking for any_errors_fatal 43681 1727204695.23169: checking for max_fail_percentage 43681 1727204695.23170: done checking for max_fail_percentage 43681 1727204695.23171: checking to see if all hosts have failed and the running result is not ok 43681 1727204695.23172: done checking to see if all hosts have failed 43681 1727204695.23173: getting the remaining hosts for this loop 43681 1727204695.23174: done getting the remaining hosts for this loop 43681 1727204695.23177: getting the next task for host managed-node3 43681 1727204695.23182: done getting next task for host managed-node3 43681 1727204695.23184: ^ task is: TASK: meta (flush_handlers) 43681 1727204695.23185: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204695.23197: getting variables 43681 1727204695.23198: in VariableManager get_vars() 43681 1727204695.23208: Calling all_inventory to load vars for managed-node3 43681 1727204695.23211: Calling groups_inventory to load vars for managed-node3 43681 1727204695.23213: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204695.23218: Calling all_plugins_play to load vars for managed-node3 43681 1727204695.23221: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204695.23225: Calling groups_plugins_play to load vars for managed-node3 43681 1727204695.23426: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204695.23706: done with get_vars() 43681 1727204695.23715: done getting variables 43681 1727204695.23767: in VariableManager get_vars() 43681 1727204695.23778: Calling all_inventory to load vars for managed-node3 43681 1727204695.23781: Calling groups_inventory to load vars for managed-node3 43681 1727204695.23784: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204695.23793: Calling all_plugins_play to load vars for managed-node3 43681 1727204695.23796: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204695.23800: Calling groups_plugins_play to load vars for managed-node3 43681 1727204695.24006: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204695.24301: done with get_vars() 43681 1727204695.24316: done queuing things up, now waiting for results queue to drain 43681 1727204695.24318: results queue empty 43681 1727204695.24319: checking for any_errors_fatal 43681 1727204695.24320: done checking for any_errors_fatal 43681 1727204695.24321: checking for max_fail_percentage 43681 1727204695.24323: done checking for max_fail_percentage 43681 1727204695.24324: checking to see if all hosts have failed and the running result is not ok 43681 1727204695.24325: done checking to see if all hosts have failed 43681 1727204695.24326: getting the remaining hosts for this loop 43681 1727204695.24327: done getting the remaining hosts for this loop 43681 1727204695.24330: getting the next task for host managed-node3 43681 1727204695.24334: done getting next task for host managed-node3 43681 1727204695.24335: ^ task is: None 43681 1727204695.24336: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204695.24338: done queuing things up, now waiting for results queue to drain 43681 1727204695.24339: results queue empty 43681 1727204695.24340: checking for any_errors_fatal 43681 1727204695.24341: done checking for any_errors_fatal 43681 1727204695.24342: checking for max_fail_percentage 43681 1727204695.24343: done checking for max_fail_percentage 43681 1727204695.24344: checking to see if all hosts have failed and the running result is not ok 43681 1727204695.24345: done checking to see if all hosts have failed 43681 1727204695.24347: getting the next task for host managed-node3 43681 1727204695.24351: done getting next task for host managed-node3 43681 1727204695.24352: ^ task is: None 43681 1727204695.24354: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 43681 1727204695.24415: in VariableManager get_vars() 43681 1727204695.24442: done with get_vars() 43681 1727204695.24450: in VariableManager get_vars() 43681 1727204695.24465: done with get_vars() 43681 1727204695.24471: variable 'omit' from source: magic vars 43681 1727204695.24513: in VariableManager get_vars() 43681 1727204695.24531: done with get_vars() 43681 1727204695.24557: variable 'omit' from source: magic vars PLAY [Test for testing routing rules] ****************************************** 43681 1727204695.24949: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 43681 1727204695.24978: getting the remaining hosts for this loop 43681 1727204695.24980: done getting the remaining hosts for this loop 43681 1727204695.24983: getting the next task for host managed-node3 43681 1727204695.24987: done getting next task for host managed-node3 43681 1727204695.24993: ^ task is: TASK: Gathering Facts 43681 1727204695.24995: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204695.24997: getting variables 43681 1727204695.24998: in VariableManager get_vars() 43681 1727204695.25012: Calling all_inventory to load vars for managed-node3 43681 1727204695.25015: Calling groups_inventory to load vars for managed-node3 43681 1727204695.25017: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204695.25023: Calling all_plugins_play to load vars for managed-node3 43681 1727204695.25040: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204695.25045: Calling groups_plugins_play to load vars for managed-node3 43681 1727204695.25306: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204695.25607: done with get_vars() 43681 1727204695.25617: done getting variables 43681 1727204695.25668: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:3 Tuesday 24 September 2024 15:04:55 -0400 (0:00:00.062) 0:00:02.923 ***** 43681 1727204695.25699: entering _queue_task() for managed-node3/gather_facts 43681 1727204695.26016: worker is 1 (out of 1 available) 43681 1727204695.26028: exiting _queue_task() for managed-node3/gather_facts 43681 1727204695.26041: done queuing things up, now waiting for results queue to drain 43681 1727204695.26043: waiting for pending results... 
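The hand-off visible in this stretch of the log, a set_fact of network_provider at tests_routing_rules_nm.yml:13 followed by a fresh play named 'Test for testing routing rules' that begins with fact gathering from playbooks/tests_routing_rules.yml:3, is the usual thin-wrapper pattern for these tests: the *_nm.yml file pins the provider and then pulls in the provider-agnostic playbook. A sketch, assuming import_playbook is the wiring (the import itself is not shown in this excerpt):

  # Hypothetical outline of tests_routing_rules_nm.yml; only the fact name and value
  # (network_provider: nm) and the imported playbook's path are taken from the log;
  # the rest is assumed wiring.
  - hosts: all
    tasks:
      - name: Set network provider to 'nm'
        ansible.builtin.set_fact:
          network_provider: nm

  - name: Import the provider-agnostic routing rules tests   # assumed
    import_playbook: playbooks/tests_routing_rules.yml

The fact collection that the Gathering Facts task performs next can be reproduced ad hoc with 'ansible managed-node3 -m ansible.builtin.setup' against the same inventory; the JSON blob that appears later in this log is the output of that same setup module, transferred and run over the multiplexed SSH connection traced in the debug1/debug2 chunks.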
43681 1727204695.26315: running TaskExecutor() for managed-node3/TASK: Gathering Facts 43681 1727204695.26495: in run() - task 12b410aa-8751-9e86-7728-00000000010b 43681 1727204695.26500: variable 'ansible_search_path' from source: unknown 43681 1727204695.26503: calling self._execute() 43681 1727204695.26592: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204695.26607: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204695.26629: variable 'omit' from source: magic vars 43681 1727204695.27075: variable 'ansible_distribution_major_version' from source: facts 43681 1727204695.27097: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204695.27108: variable 'omit' from source: magic vars 43681 1727204695.27169: variable 'omit' from source: magic vars 43681 1727204695.27185: variable 'omit' from source: magic vars 43681 1727204695.27232: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204695.27281: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204695.27311: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204695.27396: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204695.27400: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204695.27403: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204695.27405: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204695.27413: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204695.27538: Set connection var ansible_shell_type to sh 43681 1727204695.27553: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204695.27565: Set connection var ansible_timeout to 10 43681 1727204695.27579: Set connection var ansible_pipelining to False 43681 1727204695.27592: Set connection var ansible_connection to ssh 43681 1727204695.27629: Set connection var ansible_shell_executable to /bin/sh 43681 1727204695.27636: variable 'ansible_shell_executable' from source: unknown 43681 1727204695.27644: variable 'ansible_connection' from source: unknown 43681 1727204695.27652: variable 'ansible_module_compression' from source: unknown 43681 1727204695.27659: variable 'ansible_shell_type' from source: unknown 43681 1727204695.27666: variable 'ansible_shell_executable' from source: unknown 43681 1727204695.27738: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204695.27741: variable 'ansible_pipelining' from source: unknown 43681 1727204695.27743: variable 'ansible_timeout' from source: unknown 43681 1727204695.27745: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204695.27911: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 43681 1727204695.27933: variable 'omit' from source: magic vars 43681 1727204695.27944: starting attempt loop 43681 1727204695.27957: running the 
handler 43681 1727204695.27980: variable 'ansible_facts' from source: unknown 43681 1727204695.28012: _low_level_execute_command(): starting 43681 1727204695.28027: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 43681 1727204695.28793: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204695.28811: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204695.28827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204695.28847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204695.28884: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204695.28982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204695.29016: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204695.29093: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204695.30839: stdout chunk (state=3): >>>/root <<< 43681 1727204695.31052: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204695.31056: stdout chunk (state=3): >>><<< 43681 1727204695.31058: stderr chunk (state=3): >>><<< 43681 1727204695.31085: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204695.31108: _low_level_execute_command(): starting 43681 1727204695.31196: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204695.3109388-43909-250228474910964 `" && echo 
ansible-tmp-1727204695.3109388-43909-250228474910964="` echo /root/.ansible/tmp/ansible-tmp-1727204695.3109388-43909-250228474910964 `" ) && sleep 0' 43681 1727204695.31778: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204695.31792: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204695.31806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204695.31872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204695.31931: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204695.31951: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204695.31987: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204695.32076: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204695.34094: stdout chunk (state=3): >>>ansible-tmp-1727204695.3109388-43909-250228474910964=/root/.ansible/tmp/ansible-tmp-1727204695.3109388-43909-250228474910964 <<< 43681 1727204695.34213: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204695.34296: stderr chunk (state=3): >>><<< 43681 1727204695.34300: stdout chunk (state=3): >>><<< 43681 1727204695.34325: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204695.3109388-43909-250228474910964=/root/.ansible/tmp/ansible-tmp-1727204695.3109388-43909-250228474910964 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204695.34495: variable 'ansible_module_compression' from source: unknown 43681 1727204695.34499: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-436817kzep24q/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 43681 1727204695.34503: variable 'ansible_facts' from source: unknown 43681 1727204695.34688: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204695.3109388-43909-250228474910964/AnsiballZ_setup.py 43681 1727204695.34919: Sending initial data 43681 1727204695.34931: Sent initial data (154 bytes) 43681 1727204695.35547: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204695.35563: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204695.35577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204695.35603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 43681 1727204695.35708: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204695.35731: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204695.35798: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204695.37456: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 43681 1727204695.37506: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 43681 1727204695.37569: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-436817kzep24q/tmp8n3xl8qe /root/.ansible/tmp/ansible-tmp-1727204695.3109388-43909-250228474910964/AnsiballZ_setup.py <<< 43681 1727204695.37582: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204695.3109388-43909-250228474910964/AnsiballZ_setup.py" <<< 43681 1727204695.37599: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-436817kzep24q/tmp8n3xl8qe" to remote "/root/.ansible/tmp/ansible-tmp-1727204695.3109388-43909-250228474910964/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204695.3109388-43909-250228474910964/AnsiballZ_setup.py" <<< 43681 1727204695.40065: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204695.40105: stderr chunk (state=3): >>><<< 43681 1727204695.40115: stdout chunk (state=3): >>><<< 43681 1727204695.40151: done transferring module to remote 43681 1727204695.40169: _low_level_execute_command(): starting 43681 1727204695.40184: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204695.3109388-43909-250228474910964/ /root/.ansible/tmp/ansible-tmp-1727204695.3109388-43909-250228474910964/AnsiballZ_setup.py && sleep 0' 43681 1727204695.40861: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204695.40876: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204695.40893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204695.40912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204695.40933: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 43681 1727204695.41054: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204695.41075: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204695.41143: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204695.42987: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204695.43077: stderr chunk (state=3): >>><<< 43681 1727204695.43081: stdout chunk (state=3): >>><<< 43681 1727204695.43193: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204695.43197: _low_level_execute_command(): starting 43681 1727204695.43200: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204695.3109388-43909-250228474910964/AnsiballZ_setup.py && sleep 0' 43681 1727204695.43765: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204695.43780: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204695.43850: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204695.43853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204695.43926: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204695.43955: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204695.43982: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204695.44056: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204696.13558: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 50414 10.31.10.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 50414 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_lsb": {}, "ansible_iscsi_iqn": "", "ansible_system": "Linux", 
"ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec272ed147e29e35f2e68cd6465c5ec1", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_is_chroot": false, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAI5YZQ7OH6eqgmanrwxkUl16pMvE2q26X32NofYRKBzF04m84VIsiCBP80rN+sGEKnRhTwlxJwcSfAyscmxkynk8ozeR0SaMEECkbOjee1DqGR1yz8VSKEIk2gZ+ImYscF6c32jGvz1w/gz9baswEs+v92Ljqv3+V3s8foVkwWM1AAAAFQDApo03iAyJzp9y7AillVl9LpN8rwAAAIBNHNvfLLH/rvWMdavYWGiljarx5Z8cDKFv4QiliuY2AenrQ5mjBN3ZJZuDpmwC9vuoPM+TWxp9pbrnVJy4VM6iS8c/Lr9I982fUD4neMvJEywdnYtsRhezGMCk57/Npw91h6EKhcAYiaFF53jl540WIjTvu2bEA8Hgb11YGH+isAAAAIAkremps+61DEFeDWQjRHbf8fZzhmpUYduU+sHRW5usa/1cOOeeN/8XBHfMST6TPedAY/6t7Oxda9D2mq6mo2Rl9arSQWcBypqwvzRiz0LGnRnElGtXKJALy6vYKG7xi+29ZmqlBvD14cB7/wSZqZP9MkRj3+QzQJLvNnuGRyLguA==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDBj8PEtqglWtlJ3r3hgP2TELjSd8JOOpjIitLlWjKdUao5ePB6PWTf9MZV0rLZr0re7hAS1EWeexARYQakyETmyOoPmRCaD5vvrfN3AJJ6I+O2EhApLpYrEORJbTfrme6AoCGmxQG8tR7j3YpVOvePZ65ka7FDUWnNLI0DWpyDURAKmvOxtiOcYazpmB7GJ/5ycpEAV7KGp7tEQ9MNIAbSaYTBXVBNa5V2HyEmcabs+/Qy/jp8OWy+Tl3uCUV0SmFplVCKib9Kp3eEMZd5udXsYnmUYtLNMJQkQOzTdol5AozustkdBnasVn/RSnQpWQMBrrUQMxchNOb8FDAuH6AONEVJl9mHY6mk3zfkkyPZE6sIrMIj0B48xTWzMIjC+N9SN7DRRUWzjYIqkL5fsYu0fkkGuZeNvyJRlv8h7oFWA7YtvNHdNYf41mkXryERg8V3zI0aZcmQul6XTOxywwd4b5sudMIng09hfyPOKtnYi6DIN2h5FxOWlvBEbLlcd2U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPUvqdp1GSRMDwSqfOZO1hLGpDfzy51B9cIhTK2AWy7qlUXPaSlJ0jc31uj+CW3SnUW36VSKRHdj9R9hJev9Zic=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFL7RdA+aCgUcBhcJBLwti3mnwduhYXxSw8RlI3Cvebm", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memt<<< 43681 1727204696.13598: stdout chunk (state=3): >>>otal_mb": 3717, "ansible_memfree_mb": 2854, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 863, "free": 2854}, "nocache": {"free": 3485, "used": 232}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, 
"ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_uuid": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 1200, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251148410880, "block_size": 4096, "block_total": 64479564, "block_available": 61315530, "block_used": 3164034, "inode_total": 16384000, "inode_available": 16302070, "inode_used": 81930, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_local": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_hostnqn": "", "ansible_fips": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "04", "second": "56", "epoch": "1727204696", "epoch_int": "1727204696", "date": "2024-09-24", "time": "15:04:56", "iso8601_micro": "2024-09-24T19:04:56.093682Z", "iso8601": "2024-09-24T19:04:56Z", "iso8601_basic": "20240924T150456093682", "iso8601_basic_short": "20240924T150456", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_fibre_channel_wwn": [], "ansible_loadavg": {"1m": 0.95556640625, "5m": 0.88232421875, "15m": 0.55126953125}, "ansible_apparmor": 
{"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-<<< 43681 1727204696.13637: stdout chunk (state=3): >>>100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::37d3:4e93:30d:de94", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", 
"hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tc<<< 43681 1727204696.13642: stdout chunk (state=3): >>>p_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.90"], "ansible_all_ipv6_addresses": ["fe80::37d3:4e93:30d:de94"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::37d3:4e93:30d:de94"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": 
"/etc/ansible/facts.d"}}} <<< 43681 1727204696.15873: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 43681 1727204696.15935: stderr chunk (state=3): >>><<< 43681 1727204696.15939: stdout chunk (state=3): >>><<< 43681 1727204696.15971: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 50414 10.31.10.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 50414 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_lsb": {}, "ansible_iscsi_iqn": "", "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec272ed147e29e35f2e68cd6465c5ec1", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_is_chroot": false, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAI5YZQ7OH6eqgmanrwxkUl16pMvE2q26X32NofYRKBzF04m84VIsiCBP80rN+sGEKnRhTwlxJwcSfAyscmxkynk8ozeR0SaMEECkbOjee1DqGR1yz8VSKEIk2gZ+ImYscF6c32jGvz1w/gz9baswEs+v92Ljqv3+V3s8foVkwWM1AAAAFQDApo03iAyJzp9y7AillVl9LpN8rwAAAIBNHNvfLLH/rvWMdavYWGiljarx5Z8cDKFv4QiliuY2AenrQ5mjBN3ZJZuDpmwC9vuoPM+TWxp9pbrnVJy4VM6iS8c/Lr9I982fUD4neMvJEywdnYtsRhezGMCk57/Npw91h6EKhcAYiaFF53jl540WIjTvu2bEA8Hgb11YGH+isAAAAIAkremps+61DEFeDWQjRHbf8fZzhmpUYduU+sHRW5usa/1cOOeeN/8XBHfMST6TPedAY/6t7Oxda9D2mq6mo2Rl9arSQWcBypqwvzRiz0LGnRnElGtXKJALy6vYKG7xi+29ZmqlBvD14cB7/wSZqZP9MkRj3+QzQJLvNnuGRyLguA==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDBj8PEtqglWtlJ3r3hgP2TELjSd8JOOpjIitLlWjKdUao5ePB6PWTf9MZV0rLZr0re7hAS1EWeexARYQakyETmyOoPmRCaD5vvrfN3AJJ6I+O2EhApLpYrEORJbTfrme6AoCGmxQG8tR7j3YpVOvePZ65ka7FDUWnNLI0DWpyDURAKmvOxtiOcYazpmB7GJ/5ycpEAV7KGp7tEQ9MNIAbSaYTBXVBNa5V2HyEmcabs+/Qy/jp8OWy+Tl3uCUV0SmFplVCKib9Kp3eEMZd5udXsYnmUYtLNMJQkQOzTdol5AozustkdBnasVn/RSnQpWQMBrrUQMxchNOb8FDAuH6AONEVJl9mHY6mk3zfkkyPZE6sIrMIj0B48xTWzMIjC+N9SN7DRRUWzjYIqkL5fsYu0fkkGuZeNvyJRlv8h7oFWA7YtvNHdNYf41mkXryERg8V3zI0aZcmQul6XTOxywwd4b5sudMIng09hfyPOKtnYi6DIN2h5FxOWlvBEbLlcd2U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPUvqdp1GSRMDwSqfOZO1hLGpDfzy51B9cIhTK2AWy7qlUXPaSlJ0jc31uj+CW3SnUW36VSKRHdj9R9hJev9Zic=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFL7RdA+aCgUcBhcJBLwti3mnwduhYXxSw8RlI3Cvebm", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2854, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 863, "free": 2854}, "nocache": {"free": 3485, "used": 232}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_uuid": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": 
["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 1200, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251148410880, "block_size": 4096, "block_total": 64479564, "block_available": 61315530, "block_used": 3164034, "inode_total": 16384000, "inode_available": 16302070, "inode_used": 81930, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_local": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_hostnqn": "", "ansible_fips": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "04", "second": "56", "epoch": "1727204696", "epoch_int": "1727204696", "date": "2024-09-24", "time": "15:04:56", "iso8601_micro": "2024-09-24T19:04:56.093682Z", "iso8601": "2024-09-24T19:04:56Z", "iso8601_basic": "20240924T150456093682", "iso8601_basic_short": "20240924T150456", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_fibre_channel_wwn": [], "ansible_loadavg": {"1m": 0.95556640625, "5m": 0.88232421875, "15m": 0.55126953125}, "ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::37d3:4e93:30d:de94", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", 
"generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", 
"tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.90"], "ansible_all_ipv6_addresses": ["fe80::37d3:4e93:30d:de94"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::37d3:4e93:30d:de94"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
43681 1727204696.16298: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204695.3109388-43909-250228474910964/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 43681 1727204696.16319: _low_level_execute_command(): starting 43681 1727204696.16322: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204695.3109388-43909-250228474910964/ > /dev/null 2>&1 && sleep 0' 43681 1727204696.16773: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204696.16777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204696.16781: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204696.16784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204696.16839: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204696.16843: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204696.16886: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204696.18918: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204696.18934: stderr chunk (state=3): >>><<< 43681 1727204696.18946: stdout chunk (state=3): >>><<< 43681 1727204696.18969: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204696.18985: handler run complete 43681 1727204696.19170: variable 'ansible_facts' from source: unknown 43681 1727204696.19276: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204696.19558: variable 'ansible_facts' from source: unknown 43681 1727204696.19638: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204696.19758: attempt loop complete, returning result 43681 1727204696.19762: _execute() done 43681 1727204696.19767: dumping result to json 43681 1727204696.19797: done dumping result, returning 43681 1727204696.19805: done running TaskExecutor() for managed-node3/TASK: Gathering Facts [12b410aa-8751-9e86-7728-00000000010b] 43681 1727204696.19811: sending task result for task 12b410aa-8751-9e86-7728-00000000010b 43681 1727204696.20125: done sending task result for task 12b410aa-8751-9e86-7728-00000000010b 43681 1727204696.20128: WORKER PROCESS EXITING ok: [managed-node3] 43681 1727204696.20419: no more pending results, returning what we have 43681 1727204696.20422: results queue empty 43681 1727204696.20423: checking for any_errors_fatal 43681 1727204696.20424: done checking for any_errors_fatal 43681 1727204696.20424: checking for max_fail_percentage 43681 1727204696.20425: done checking for max_fail_percentage 43681 1727204696.20426: checking to see if all hosts have failed and the running result is not ok 43681 1727204696.20427: done checking to see if all hosts have failed 43681 1727204696.20427: getting the remaining hosts for this loop 43681 1727204696.20428: done getting the remaining hosts for this loop 43681 1727204696.20431: getting the next task for host managed-node3 43681 1727204696.20435: done getting next task for host managed-node3 43681 1727204696.20436: ^ task is: TASK: meta (flush_handlers) 43681 1727204696.20438: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204696.20441: getting variables 43681 1727204696.20442: in VariableManager get_vars() 43681 1727204696.20464: Calling all_inventory to load vars for managed-node3 43681 1727204696.20466: Calling groups_inventory to load vars for managed-node3 43681 1727204696.20468: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204696.20476: Calling all_plugins_play to load vars for managed-node3 43681 1727204696.20478: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204696.20480: Calling groups_plugins_play to load vars for managed-node3 43681 1727204696.20623: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204696.20876: done with get_vars() 43681 1727204696.20892: done getting variables 43681 1727204696.20980: in VariableManager get_vars() 43681 1727204696.21000: Calling all_inventory to load vars for managed-node3 43681 1727204696.21003: Calling groups_inventory to load vars for managed-node3 43681 1727204696.21006: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204696.21011: Calling all_plugins_play to load vars for managed-node3 43681 1727204696.21014: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204696.21017: Calling groups_plugins_play to load vars for managed-node3 43681 1727204696.21236: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204696.21549: done with get_vars() 43681 1727204696.21566: done queuing things up, now waiting for results queue to drain 43681 1727204696.21568: results queue empty 43681 1727204696.21569: checking for any_errors_fatal 43681 1727204696.21573: done checking for any_errors_fatal 43681 1727204696.21574: checking for max_fail_percentage 43681 1727204696.21576: done checking for max_fail_percentage 43681 1727204696.21576: checking to see if all hosts have failed and the running result is not ok 43681 1727204696.21581: done checking to see if all hosts have failed 43681 1727204696.21583: getting the remaining hosts for this loop 43681 1727204696.21584: done getting the remaining hosts for this loop 43681 1727204696.21586: getting the next task for host managed-node3 43681 1727204696.21592: done getting next task for host managed-node3 43681 1727204696.21595: ^ task is: TASK: Set type={{ type }} and interface={{ interface }} 43681 1727204696.21597: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204696.21599: getting variables 43681 1727204696.21600: in VariableManager get_vars() 43681 1727204696.21613: Calling all_inventory to load vars for managed-node3 43681 1727204696.21616: Calling groups_inventory to load vars for managed-node3 43681 1727204696.21619: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204696.21625: Calling all_plugins_play to load vars for managed-node3 43681 1727204696.21628: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204696.21631: Calling groups_plugins_play to load vars for managed-node3 43681 1727204696.21845: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204696.22156: done with get_vars() 43681 1727204696.22168: done getting variables 43681 1727204696.22224: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 43681 1727204696.22387: variable 'type' from source: play vars 43681 1727204696.22395: variable 'interface' from source: play vars TASK [Set type=veth and interface=ethtest0] ************************************ task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:10 Tuesday 24 September 2024 15:04:56 -0400 (0:00:00.967) 0:00:03.891 ***** 43681 1727204696.22442: entering _queue_task() for managed-node3/set_fact 43681 1727204696.22766: worker is 1 (out of 1 available) 43681 1727204696.22778: exiting _queue_task() for managed-node3/set_fact 43681 1727204696.22896: done queuing things up, now waiting for results queue to drain 43681 1727204696.22899: waiting for pending results... 
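The task header above points at line 10 of tests_routing_rules.yml. From the play-var sources logged for 'type' and 'interface' and the ansible_facts result recorded a few entries further down (type=veth, interface=ethtest0), the task is a set_fact of roughly this shape (an illustrative reconstruction, not a verbatim copy of the test file):

    - name: "Set type={{ type }} and interface={{ interface }}"
      ansible.builtin.set_fact:
        # both values come from play vars; the run below records the resulting
        # facts as type=veth and interface=ethtest0 for managed-node3
        type: "{{ type }}"
        interface: "{{ interface }}"
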
43681 1727204696.23228: running TaskExecutor() for managed-node3/TASK: Set type=veth and interface=ethtest0 43681 1727204696.23240: in run() - task 12b410aa-8751-9e86-7728-00000000000b 43681 1727204696.23244: variable 'ansible_search_path' from source: unknown 43681 1727204696.23247: calling self._execute() 43681 1727204696.23495: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204696.23499: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204696.23502: variable 'omit' from source: magic vars 43681 1727204696.23795: variable 'ansible_distribution_major_version' from source: facts 43681 1727204696.23816: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204696.23830: variable 'omit' from source: magic vars 43681 1727204696.23857: variable 'omit' from source: magic vars 43681 1727204696.23905: variable 'type' from source: play vars 43681 1727204696.23996: variable 'type' from source: play vars 43681 1727204696.24012: variable 'interface' from source: play vars 43681 1727204696.24092: variable 'interface' from source: play vars 43681 1727204696.24118: variable 'omit' from source: magic vars 43681 1727204696.24166: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204696.24220: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204696.24249: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204696.24275: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204696.24297: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204696.24345: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204696.24357: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204696.24367: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204696.24535: Set connection var ansible_shell_type to sh 43681 1727204696.24539: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204696.24542: Set connection var ansible_timeout to 10 43681 1727204696.24548: Set connection var ansible_pipelining to False 43681 1727204696.24562: Set connection var ansible_connection to ssh 43681 1727204696.24573: Set connection var ansible_shell_executable to /bin/sh 43681 1727204696.24593: variable 'ansible_shell_executable' from source: unknown 43681 1727204696.24597: variable 'ansible_connection' from source: unknown 43681 1727204696.24602: variable 'ansible_module_compression' from source: unknown 43681 1727204696.24605: variable 'ansible_shell_type' from source: unknown 43681 1727204696.24609: variable 'ansible_shell_executable' from source: unknown 43681 1727204696.24612: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204696.24618: variable 'ansible_pipelining' from source: unknown 43681 1727204696.24630: variable 'ansible_timeout' from source: unknown 43681 1727204696.24635: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204696.24768: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 43681 1727204696.24777: variable 'omit' from source: magic vars 43681 1727204696.24784: starting attempt loop 43681 1727204696.24787: running the handler 43681 1727204696.24801: handler run complete 43681 1727204696.24810: attempt loop complete, returning result 43681 1727204696.24813: _execute() done 43681 1727204696.24817: dumping result to json 43681 1727204696.24824: done dumping result, returning 43681 1727204696.24830: done running TaskExecutor() for managed-node3/TASK: Set type=veth and interface=ethtest0 [12b410aa-8751-9e86-7728-00000000000b] 43681 1727204696.24836: sending task result for task 12b410aa-8751-9e86-7728-00000000000b 43681 1727204696.24924: done sending task result for task 12b410aa-8751-9e86-7728-00000000000b 43681 1727204696.24927: WORKER PROCESS EXITING ok: [managed-node3] => { "ansible_facts": { "interface": "ethtest0", "type": "veth" }, "changed": false } 43681 1727204696.25100: no more pending results, returning what we have 43681 1727204696.25103: results queue empty 43681 1727204696.25104: checking for any_errors_fatal 43681 1727204696.25105: done checking for any_errors_fatal 43681 1727204696.25106: checking for max_fail_percentage 43681 1727204696.25108: done checking for max_fail_percentage 43681 1727204696.25109: checking to see if all hosts have failed and the running result is not ok 43681 1727204696.25110: done checking to see if all hosts have failed 43681 1727204696.25111: getting the remaining hosts for this loop 43681 1727204696.25112: done getting the remaining hosts for this loop 43681 1727204696.25116: getting the next task for host managed-node3 43681 1727204696.25120: done getting next task for host managed-node3 43681 1727204696.25123: ^ task is: TASK: Include the task 'show_interfaces.yml' 43681 1727204696.25126: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204696.25133: getting variables 43681 1727204696.25135: in VariableManager get_vars() 43681 1727204696.25155: Calling all_inventory to load vars for managed-node3 43681 1727204696.25156: Calling groups_inventory to load vars for managed-node3 43681 1727204696.25158: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204696.25165: Calling all_plugins_play to load vars for managed-node3 43681 1727204696.25168: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204696.25170: Calling groups_plugins_play to load vars for managed-node3 43681 1727204696.25308: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204696.25481: done with get_vars() 43681 1727204696.25491: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:14 Tuesday 24 September 2024 15:04:56 -0400 (0:00:00.031) 0:00:03.922 ***** 43681 1727204696.25559: entering _queue_task() for managed-node3/include_tasks 43681 1727204696.25759: worker is 1 (out of 1 available) 43681 1727204696.25774: exiting _queue_task() for managed-node3/include_tasks 43681 1727204696.25788: done queuing things up, now waiting for results queue to drain 43681 1727204696.25792: waiting for pending results... 43681 1727204696.25942: running TaskExecutor() for managed-node3/TASK: Include the task 'show_interfaces.yml' 43681 1727204696.26005: in run() - task 12b410aa-8751-9e86-7728-00000000000c 43681 1727204696.26020: variable 'ansible_search_path' from source: unknown 43681 1727204696.26053: calling self._execute() 43681 1727204696.26123: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204696.26134: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204696.26145: variable 'omit' from source: magic vars 43681 1727204696.26451: variable 'ansible_distribution_major_version' from source: facts 43681 1727204696.26495: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204696.26498: _execute() done 43681 1727204696.26501: dumping result to json 43681 1727204696.26503: done dumping result, returning 43681 1727204696.26695: done running TaskExecutor() for managed-node3/TASK: Include the task 'show_interfaces.yml' [12b410aa-8751-9e86-7728-00000000000c] 43681 1727204696.26699: sending task result for task 12b410aa-8751-9e86-7728-00000000000c 43681 1727204696.26765: done sending task result for task 12b410aa-8751-9e86-7728-00000000000c 43681 1727204696.26769: WORKER PROCESS EXITING 43681 1727204696.26796: no more pending results, returning what we have 43681 1727204696.26801: in VariableManager get_vars() 43681 1727204696.26838: Calling all_inventory to load vars for managed-node3 43681 1727204696.26841: Calling groups_inventory to load vars for managed-node3 43681 1727204696.26844: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204696.26855: Calling all_plugins_play to load vars for managed-node3 43681 1727204696.26858: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204696.26862: Calling groups_plugins_play to load vars for managed-node3 43681 1727204696.27131: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204696.27443: done with get_vars() 43681 
1727204696.27453: variable 'ansible_search_path' from source: unknown 43681 1727204696.27466: we have included files to process 43681 1727204696.27467: generating all_blocks data 43681 1727204696.27469: done generating all_blocks data 43681 1727204696.27470: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 43681 1727204696.27471: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 43681 1727204696.27474: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 43681 1727204696.27658: in VariableManager get_vars() 43681 1727204696.27676: done with get_vars() 43681 1727204696.27764: done processing included file 43681 1727204696.27767: iterating over new_blocks loaded from include file 43681 1727204696.27768: in VariableManager get_vars() 43681 1727204696.27780: done with get_vars() 43681 1727204696.27781: filtering new block on tags 43681 1727204696.27796: done filtering new block on tags 43681 1727204696.27798: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node3 43681 1727204696.27801: extending task lists for all hosts with included blocks 43681 1727204696.28743: done extending task lists 43681 1727204696.28745: done processing included files 43681 1727204696.28746: results queue empty 43681 1727204696.28747: checking for any_errors_fatal 43681 1727204696.28751: done checking for any_errors_fatal 43681 1727204696.28751: checking for max_fail_percentage 43681 1727204696.28752: done checking for max_fail_percentage 43681 1727204696.28753: checking to see if all hosts have failed and the running result is not ok 43681 1727204696.28753: done checking to see if all hosts have failed 43681 1727204696.28754: getting the remaining hosts for this loop 43681 1727204696.28755: done getting the remaining hosts for this loop 43681 1727204696.28757: getting the next task for host managed-node3 43681 1727204696.28760: done getting next task for host managed-node3 43681 1727204696.28761: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 43681 1727204696.28763: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204696.28765: getting variables 43681 1727204696.28766: in VariableManager get_vars() 43681 1727204696.28795: Calling all_inventory to load vars for managed-node3 43681 1727204696.28797: Calling groups_inventory to load vars for managed-node3 43681 1727204696.28799: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204696.28803: Calling all_plugins_play to load vars for managed-node3 43681 1727204696.28805: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204696.28807: Calling groups_plugins_play to load vars for managed-node3 43681 1727204696.28929: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204696.29100: done with get_vars() 43681 1727204696.29108: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 15:04:56 -0400 (0:00:00.036) 0:00:03.958 ***** 43681 1727204696.29164: entering _queue_task() for managed-node3/include_tasks 43681 1727204696.29366: worker is 1 (out of 1 available) 43681 1727204696.29380: exiting _queue_task() for managed-node3/include_tasks 43681 1727204696.29393: done queuing things up, now waiting for results queue to drain 43681 1727204696.29395: waiting for pending results... 43681 1727204696.29554: running TaskExecutor() for managed-node3/TASK: Include the task 'get_current_interfaces.yml' 43681 1727204696.29628: in run() - task 12b410aa-8751-9e86-7728-000000000121 43681 1727204696.29637: variable 'ansible_search_path' from source: unknown 43681 1727204696.29641: variable 'ansible_search_path' from source: unknown 43681 1727204696.29672: calling self._execute() 43681 1727204696.29844: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204696.29850: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204696.29853: variable 'omit' from source: magic vars 43681 1727204696.30056: variable 'ansible_distribution_major_version' from source: facts 43681 1727204696.30077: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204696.30081: _execute() done 43681 1727204696.30085: dumping result to json 43681 1727204696.30087: done dumping result, returning 43681 1727204696.30092: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_current_interfaces.yml' [12b410aa-8751-9e86-7728-000000000121] 43681 1727204696.30095: sending task result for task 12b410aa-8751-9e86-7728-000000000121 43681 1727204696.30188: done sending task result for task 12b410aa-8751-9e86-7728-000000000121 43681 1727204696.30192: WORKER PROCESS EXITING 43681 1727204696.30221: no more pending results, returning what we have 43681 1727204696.30226: in VariableManager get_vars() 43681 1727204696.30263: Calling all_inventory to load vars for managed-node3 43681 1727204696.30266: Calling groups_inventory to load vars for managed-node3 43681 1727204696.30268: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204696.30278: Calling all_plugins_play to load vars for managed-node3 43681 1727204696.30281: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204696.30284: Calling groups_plugins_play to load vars for managed-node3 43681 1727204696.30451: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' 
skipped due to reserved name 43681 1727204696.30646: done with get_vars() 43681 1727204696.30653: variable 'ansible_search_path' from source: unknown 43681 1727204696.30654: variable 'ansible_search_path' from source: unknown 43681 1727204696.30681: we have included files to process 43681 1727204696.30682: generating all_blocks data 43681 1727204696.30683: done generating all_blocks data 43681 1727204696.30684: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 43681 1727204696.30685: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 43681 1727204696.30687: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 43681 1727204696.30934: done processing included file 43681 1727204696.30936: iterating over new_blocks loaded from include file 43681 1727204696.30937: in VariableManager get_vars() 43681 1727204696.30951: done with get_vars() 43681 1727204696.30953: filtering new block on tags 43681 1727204696.30966: done filtering new block on tags 43681 1727204696.30967: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed-node3 43681 1727204696.30971: extending task lists for all hosts with included blocks 43681 1727204696.31050: done extending task lists 43681 1727204696.31051: done processing included files 43681 1727204696.31052: results queue empty 43681 1727204696.31053: checking for any_errors_fatal 43681 1727204696.31055: done checking for any_errors_fatal 43681 1727204696.31056: checking for max_fail_percentage 43681 1727204696.31057: done checking for max_fail_percentage 43681 1727204696.31058: checking to see if all hosts have failed and the running result is not ok 43681 1727204696.31059: done checking to see if all hosts have failed 43681 1727204696.31060: getting the remaining hosts for this loop 43681 1727204696.31061: done getting the remaining hosts for this loop 43681 1727204696.31063: getting the next task for host managed-node3 43681 1727204696.31066: done getting next task for host managed-node3 43681 1727204696.31068: ^ task is: TASK: Gather current interface info 43681 1727204696.31070: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204696.31072: getting variables 43681 1727204696.31073: in VariableManager get_vars() 43681 1727204696.31081: Calling all_inventory to load vars for managed-node3 43681 1727204696.31082: Calling groups_inventory to load vars for managed-node3 43681 1727204696.31084: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204696.31088: Calling all_plugins_play to load vars for managed-node3 43681 1727204696.31092: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204696.31094: Calling groups_plugins_play to load vars for managed-node3 43681 1727204696.31218: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204696.31390: done with get_vars() 43681 1727204696.31398: done getting variables 43681 1727204696.31430: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 15:04:56 -0400 (0:00:00.022) 0:00:03.981 ***** 43681 1727204696.31452: entering _queue_task() for managed-node3/command 43681 1727204696.31652: worker is 1 (out of 1 available) 43681 1727204696.31666: exiting _queue_task() for managed-node3/command 43681 1727204696.31679: done queuing things up, now waiting for results queue to drain 43681 1727204696.31681: waiting for pending results... 
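The 'Gather current interface info' task at get_current_interfaces.yml:3 runs through the command action plugin, as the ANSIBALLZ module build and transfer entries below show. The exact command line is not visible in this excerpt; a task of the same general shape (the command argument is a hypothetical placeholder) would be:

    - name: Gather current interface info
      ansible.builtin.command: ls /sys/class/net   # placeholder; the real argv is not shown in this log excerpt
      register: _current_interfaces
      changed_when: false
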
43681 1727204696.31839: running TaskExecutor() for managed-node3/TASK: Gather current interface info 43681 1727204696.31925: in run() - task 12b410aa-8751-9e86-7728-0000000001b0 43681 1727204696.31938: variable 'ansible_search_path' from source: unknown 43681 1727204696.31942: variable 'ansible_search_path' from source: unknown 43681 1727204696.31974: calling self._execute() 43681 1727204696.32043: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204696.32050: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204696.32060: variable 'omit' from source: magic vars 43681 1727204696.32614: variable 'ansible_distribution_major_version' from source: facts 43681 1727204696.32624: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204696.32632: variable 'omit' from source: magic vars 43681 1727204696.32670: variable 'omit' from source: magic vars 43681 1727204696.32701: variable 'omit' from source: magic vars 43681 1727204696.32734: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204696.32768: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204696.32787: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204696.32807: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204696.32819: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204696.32845: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204696.32848: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204696.32853: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204696.32941: Set connection var ansible_shell_type to sh 43681 1727204696.32948: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204696.32955: Set connection var ansible_timeout to 10 43681 1727204696.32963: Set connection var ansible_pipelining to False 43681 1727204696.32969: Set connection var ansible_connection to ssh 43681 1727204696.32976: Set connection var ansible_shell_executable to /bin/sh 43681 1727204696.33001: variable 'ansible_shell_executable' from source: unknown 43681 1727204696.33005: variable 'ansible_connection' from source: unknown 43681 1727204696.33008: variable 'ansible_module_compression' from source: unknown 43681 1727204696.33012: variable 'ansible_shell_type' from source: unknown 43681 1727204696.33018: variable 'ansible_shell_executable' from source: unknown 43681 1727204696.33021: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204696.33023: variable 'ansible_pipelining' from source: unknown 43681 1727204696.33027: variable 'ansible_timeout' from source: unknown 43681 1727204696.33029: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204696.33150: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 43681 1727204696.33156: variable 'omit' from source: magic vars 43681 
1727204696.33163: starting attempt loop 43681 1727204696.33166: running the handler 43681 1727204696.33180: _low_level_execute_command(): starting 43681 1727204696.33187: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 43681 1727204696.33732: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204696.33736: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204696.33740: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 43681 1727204696.33743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204696.33797: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204696.33800: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204696.33850: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204696.35642: stdout chunk (state=3): >>>/root <<< 43681 1727204696.35759: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204696.35809: stderr chunk (state=3): >>><<< 43681 1727204696.35812: stdout chunk (state=3): >>><<< 43681 1727204696.35836: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204696.35849: _low_level_execute_command(): starting 43681 1727204696.35857: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204696.3583503-43942-225224910972590 `" && echo ansible-tmp-1727204696.3583503-43942-225224910972590="` 
echo /root/.ansible/tmp/ansible-tmp-1727204696.3583503-43942-225224910972590 `" ) && sleep 0' 43681 1727204696.36323: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204696.36326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 43681 1727204696.36329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204696.36338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 43681 1727204696.36341: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204696.36397: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204696.36400: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204696.36439: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204696.38503: stdout chunk (state=3): >>>ansible-tmp-1727204696.3583503-43942-225224910972590=/root/.ansible/tmp/ansible-tmp-1727204696.3583503-43942-225224910972590 <<< 43681 1727204696.38622: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204696.38669: stderr chunk (state=3): >>><<< 43681 1727204696.38672: stdout chunk (state=3): >>><<< 43681 1727204696.38687: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204696.3583503-43942-225224910972590=/root/.ansible/tmp/ansible-tmp-1727204696.3583503-43942-225224910972590 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204696.38718: variable 'ansible_module_compression' from source: unknown 43681 1727204696.38763: ANSIBALLZ: Using generic lock for ansible.legacy.command 43681 1727204696.38767: ANSIBALLZ: 
Acquiring lock 43681 1727204696.38769: ANSIBALLZ: Lock acquired: 140156138759584 43681 1727204696.38773: ANSIBALLZ: Creating module 43681 1727204696.49527: ANSIBALLZ: Writing module into payload 43681 1727204696.49607: ANSIBALLZ: Writing module 43681 1727204696.49633: ANSIBALLZ: Renaming module 43681 1727204696.49640: ANSIBALLZ: Done creating module 43681 1727204696.49655: variable 'ansible_facts' from source: unknown 43681 1727204696.49704: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204696.3583503-43942-225224910972590/AnsiballZ_command.py 43681 1727204696.49825: Sending initial data 43681 1727204696.49829: Sent initial data (156 bytes) 43681 1727204696.50325: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204696.50329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 43681 1727204696.50331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 43681 1727204696.50334: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204696.50336: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204696.50394: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204696.50398: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204696.50448: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204696.52169: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 43681 1727204696.52173: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 43681 1727204696.52206: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 43681 1727204696.52250: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-436817kzep24q/tmpc8pp5gg4 /root/.ansible/tmp/ansible-tmp-1727204696.3583503-43942-225224910972590/AnsiballZ_command.py <<< 43681 1727204696.52258: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204696.3583503-43942-225224910972590/AnsiballZ_command.py" <<< 43681 1727204696.52278: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-436817kzep24q/tmpc8pp5gg4" to remote "/root/.ansible/tmp/ansible-tmp-1727204696.3583503-43942-225224910972590/AnsiballZ_command.py" <<< 43681 1727204696.52281: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204696.3583503-43942-225224910972590/AnsiballZ_command.py" <<< 43681 1727204696.53026: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204696.53094: stderr chunk (state=3): >>><<< 43681 1727204696.53098: stdout chunk (state=3): >>><<< 43681 1727204696.53124: done transferring module to remote 43681 1727204696.53142: _low_level_execute_command(): starting 43681 1727204696.53145: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204696.3583503-43942-225224910972590/ /root/.ansible/tmp/ansible-tmp-1727204696.3583503-43942-225224910972590/AnsiballZ_command.py && sleep 0' 43681 1727204696.53612: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204696.53615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204696.53621: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 43681 1727204696.53623: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 43681 1727204696.53630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204696.53677: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204696.53682: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204696.53717: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204696.55585: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204696.55640: stderr chunk (state=3): >>><<< 43681 1727204696.55644: stdout chunk (state=3): >>><<< 43681 1727204696.55660: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204696.55664: _low_level_execute_command(): starting 43681 1727204696.55670: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204696.3583503-43942-225224910972590/AnsiballZ_command.py && sleep 0' 43681 1727204696.56134: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204696.56139: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 43681 1727204696.56142: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204696.56145: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 43681 1727204696.56147: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204696.56200: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204696.56205: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204696.56248: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204696.73795: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:04:56.733120", "end": "2024-09-24 15:04:56.736630", "delta": "0:00:00.003510", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 43681 1727204696.75747: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
<<< 43681 1727204696.75755: stderr chunk (state=3): >>><<< 43681 1727204696.75765: stdout chunk (state=3): >>><<< 43681 1727204696.75818: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:04:56.733120", "end": "2024-09-24 15:04:56.736630", "delta": "0:00:00.003510", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
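The module invocation recorded above corresponds to a task of roughly this shape. This is a minimal sketch reconstructed from the module_args visible in the log ("chdir": "/sys/class/net", "_raw_params": "ls -1"); the register name _current_interfaces is an assumption, inferred from the variable referenced later in this run.

    - name: Gather current interface info
      ansible.builtin.command: ls -1      # yields the "bonding_masters\neth0\nlo" stdout seen above
      args:
        chdir: /sys/class/net             # confirmed by the chdir module arg in the log
      register: _current_interfaces       # assumed name, based on the later '_current_interfaces' reference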
43681 1727204696.75855: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204696.3583503-43942-225224910972590/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 43681 1727204696.75874: _low_level_execute_command(): starting 43681 1727204696.75884: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204696.3583503-43942-225224910972590/ > /dev/null 2>&1 && sleep 0' 43681 1727204696.76581: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204696.76691: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 43681 1727204696.76712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204696.76746: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204696.76770: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204696.76788: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204696.76857: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204696.78995: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204696.78999: stdout chunk (state=3): >>><<< 43681 1727204696.79001: stderr chunk (state=3): >>><<< 43681 1727204696.79004: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204696.79006: handler run complete 43681 1727204696.79009: Evaluated conditional (False): False 43681 1727204696.79011: attempt loop complete, returning result 43681 1727204696.79013: _execute() done 43681 1727204696.79015: dumping result to json 43681 1727204696.79017: done dumping result, returning 43681 1727204696.79019: done running TaskExecutor() for managed-node3/TASK: Gather current interface info [12b410aa-8751-9e86-7728-0000000001b0] 43681 1727204696.79034: sending task result for task 12b410aa-8751-9e86-7728-0000000001b0 43681 1727204696.79336: done sending task result for task 12b410aa-8751-9e86-7728-0000000001b0 43681 1727204696.79339: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003510", "end": "2024-09-24 15:04:56.736630", "rc": 0, "start": "2024-09-24 15:04:56.733120" } STDOUT: bonding_masters eth0 lo 43681 1727204696.80057: no more pending results, returning what we have 43681 1727204696.80061: results queue empty 43681 1727204696.80062: checking for any_errors_fatal 43681 1727204696.80063: done checking for any_errors_fatal 43681 1727204696.80064: checking for max_fail_percentage 43681 1727204696.80066: done checking for max_fail_percentage 43681 1727204696.80067: checking to see if all hosts have failed and the running result is not ok 43681 1727204696.80068: done checking to see if all hosts have failed 43681 1727204696.80069: getting the remaining hosts for this loop 43681 1727204696.80070: done getting the remaining hosts for this loop 43681 1727204696.80074: getting the next task for host managed-node3 43681 1727204696.80080: done getting next task for host managed-node3 43681 1727204696.80083: ^ task is: TASK: Set current_interfaces 43681 1727204696.80088: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204696.80095: getting variables 43681 1727204696.80097: in VariableManager get_vars() 43681 1727204696.80129: Calling all_inventory to load vars for managed-node3 43681 1727204696.80132: Calling groups_inventory to load vars for managed-node3 43681 1727204696.80135: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204696.80152: Calling all_plugins_play to load vars for managed-node3 43681 1727204696.80155: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204696.80159: Calling groups_plugins_play to load vars for managed-node3 43681 1727204696.80402: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204696.80745: done with get_vars() 43681 1727204696.80759: done getting variables 43681 1727204696.80837: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 15:04:56 -0400 (0:00:00.494) 0:00:04.475 ***** 43681 1727204696.80871: entering _queue_task() for managed-node3/set_fact 43681 1727204696.81296: worker is 1 (out of 1 available) 43681 1727204696.81308: exiting _queue_task() for managed-node3/set_fact 43681 1727204696.81321: done queuing things up, now waiting for results queue to drain 43681 1727204696.81323: waiting for pending results... 
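The 'Set current_interfaces' task queued here (get_current_interfaces.yml:9) is most likely a set_fact that turns the registered command output into a list. A sketch under that assumption, consistent with the result reported below (current_interfaces == ['bonding_masters', 'eth0', 'lo']):

    - name: Set current_interfaces
      ansible.builtin.set_fact:
        current_interfaces: "{{ _current_interfaces.stdout_lines }}"   # assumed expression; matches the stdout_lines of the earlier ls -1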
43681 1727204696.81575: running TaskExecutor() for managed-node3/TASK: Set current_interfaces 43681 1727204696.81678: in run() - task 12b410aa-8751-9e86-7728-0000000001b1 43681 1727204696.81781: variable 'ansible_search_path' from source: unknown 43681 1727204696.81786: variable 'ansible_search_path' from source: unknown 43681 1727204696.81791: calling self._execute() 43681 1727204696.81854: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204696.81870: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204696.81895: variable 'omit' from source: magic vars 43681 1727204696.82895: variable 'ansible_distribution_major_version' from source: facts 43681 1727204696.82899: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204696.82902: variable 'omit' from source: magic vars 43681 1727204696.82904: variable 'omit' from source: magic vars 43681 1727204696.82953: variable '_current_interfaces' from source: set_fact 43681 1727204696.83137: variable 'omit' from source: magic vars 43681 1727204696.83264: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204696.83316: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204696.83394: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204696.83587: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204696.83591: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204696.83594: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204696.83596: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204696.83598: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204696.83792: Set connection var ansible_shell_type to sh 43681 1727204696.83907: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204696.83995: Set connection var ansible_timeout to 10 43681 1727204696.83998: Set connection var ansible_pipelining to False 43681 1727204696.84001: Set connection var ansible_connection to ssh 43681 1727204696.84003: Set connection var ansible_shell_executable to /bin/sh 43681 1727204696.84005: variable 'ansible_shell_executable' from source: unknown 43681 1727204696.84008: variable 'ansible_connection' from source: unknown 43681 1727204696.84010: variable 'ansible_module_compression' from source: unknown 43681 1727204696.84012: variable 'ansible_shell_type' from source: unknown 43681 1727204696.84014: variable 'ansible_shell_executable' from source: unknown 43681 1727204696.84016: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204696.84132: variable 'ansible_pipelining' from source: unknown 43681 1727204696.84136: variable 'ansible_timeout' from source: unknown 43681 1727204696.84138: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204696.84568: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 43681 1727204696.84572: variable 'omit' from source: magic vars 43681 1727204696.84574: starting attempt loop 43681 1727204696.84576: running the handler 43681 1727204696.84578: handler run complete 43681 1727204696.84581: attempt loop complete, returning result 43681 1727204696.84583: _execute() done 43681 1727204696.84585: dumping result to json 43681 1727204696.84587: done dumping result, returning 43681 1727204696.84592: done running TaskExecutor() for managed-node3/TASK: Set current_interfaces [12b410aa-8751-9e86-7728-0000000001b1] 43681 1727204696.84784: sending task result for task 12b410aa-8751-9e86-7728-0000000001b1 43681 1727204696.84857: done sending task result for task 12b410aa-8751-9e86-7728-0000000001b1 43681 1727204696.84860: WORKER PROCESS EXITING ok: [managed-node3] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 43681 1727204696.84957: no more pending results, returning what we have 43681 1727204696.84960: results queue empty 43681 1727204696.84962: checking for any_errors_fatal 43681 1727204696.84973: done checking for any_errors_fatal 43681 1727204696.84974: checking for max_fail_percentage 43681 1727204696.84976: done checking for max_fail_percentage 43681 1727204696.84977: checking to see if all hosts have failed and the running result is not ok 43681 1727204696.84978: done checking to see if all hosts have failed 43681 1727204696.84979: getting the remaining hosts for this loop 43681 1727204696.84980: done getting the remaining hosts for this loop 43681 1727204696.84985: getting the next task for host managed-node3 43681 1727204696.84997: done getting next task for host managed-node3 43681 1727204696.85000: ^ task is: TASK: Show current_interfaces 43681 1727204696.85004: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204696.85008: getting variables 43681 1727204696.85010: in VariableManager get_vars() 43681 1727204696.85051: Calling all_inventory to load vars for managed-node3 43681 1727204696.85055: Calling groups_inventory to load vars for managed-node3 43681 1727204696.85058: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204696.85071: Calling all_plugins_play to load vars for managed-node3 43681 1727204696.85074: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204696.85078: Calling groups_plugins_play to load vars for managed-node3 43681 1727204696.85727: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204696.86548: done with get_vars() 43681 1727204696.86563: done getting variables 43681 1727204696.86892: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 15:04:56 -0400 (0:00:00.060) 0:00:04.535 ***** 43681 1727204696.86928: entering _queue_task() for managed-node3/debug 43681 1727204696.86931: Creating lock for debug 43681 1727204696.87641: worker is 1 (out of 1 available) 43681 1727204696.87656: exiting _queue_task() for managed-node3/debug 43681 1727204696.87671: done queuing things up, now waiting for results queue to drain 43681 1727204696.87673: waiting for pending results... 
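The 'Show current_interfaces' task queued here (show_interfaces.yml:5) appears to be a plain debug. A sketch that would reproduce the MSG line printed below; the exact message template is assumed, since the log only shows the rendered output:

    - name: Show current_interfaces
      ansible.builtin.debug:
        msg: "current_interfaces: {{ current_interfaces }}"   # assumed template; the log shows only the rendered message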
43681 1727204696.88039: running TaskExecutor() for managed-node3/TASK: Show current_interfaces 43681 1727204696.88467: in run() - task 12b410aa-8751-9e86-7728-000000000122 43681 1727204696.88471: variable 'ansible_search_path' from source: unknown 43681 1727204696.88475: variable 'ansible_search_path' from source: unknown 43681 1727204696.88479: calling self._execute() 43681 1727204696.88643: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204696.88659: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204696.88683: variable 'omit' from source: magic vars 43681 1727204696.89653: variable 'ansible_distribution_major_version' from source: facts 43681 1727204696.89713: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204696.89726: variable 'omit' from source: magic vars 43681 1727204696.89777: variable 'omit' from source: magic vars 43681 1727204696.89967: variable 'current_interfaces' from source: set_fact 43681 1727204696.89970: variable 'omit' from source: magic vars 43681 1727204696.89981: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204696.90031: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204696.90057: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204696.90088: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204696.90107: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204696.90146: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204696.90155: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204696.90163: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204696.90280: Set connection var ansible_shell_type to sh 43681 1727204696.90299: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204696.90311: Set connection var ansible_timeout to 10 43681 1727204696.90325: Set connection var ansible_pipelining to False 43681 1727204696.90396: Set connection var ansible_connection to ssh 43681 1727204696.90399: Set connection var ansible_shell_executable to /bin/sh 43681 1727204696.90402: variable 'ansible_shell_executable' from source: unknown 43681 1727204696.90405: variable 'ansible_connection' from source: unknown 43681 1727204696.90407: variable 'ansible_module_compression' from source: unknown 43681 1727204696.90409: variable 'ansible_shell_type' from source: unknown 43681 1727204696.90411: variable 'ansible_shell_executable' from source: unknown 43681 1727204696.90413: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204696.90416: variable 'ansible_pipelining' from source: unknown 43681 1727204696.90418: variable 'ansible_timeout' from source: unknown 43681 1727204696.90426: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204696.90585: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 
43681 1727204696.90606: variable 'omit' from source: magic vars 43681 1727204696.90622: starting attempt loop 43681 1727204696.90629: running the handler 43681 1727204696.90680: handler run complete 43681 1727204696.90705: attempt loop complete, returning result 43681 1727204696.90723: _execute() done 43681 1727204696.90726: dumping result to json 43681 1727204696.90833: done dumping result, returning 43681 1727204696.90837: done running TaskExecutor() for managed-node3/TASK: Show current_interfaces [12b410aa-8751-9e86-7728-000000000122] 43681 1727204696.90839: sending task result for task 12b410aa-8751-9e86-7728-000000000122 43681 1727204696.90913: done sending task result for task 12b410aa-8751-9e86-7728-000000000122 43681 1727204696.90917: WORKER PROCESS EXITING ok: [managed-node3] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 43681 1727204696.90995: no more pending results, returning what we have 43681 1727204696.91000: results queue empty 43681 1727204696.91001: checking for any_errors_fatal 43681 1727204696.91006: done checking for any_errors_fatal 43681 1727204696.91007: checking for max_fail_percentage 43681 1727204696.91010: done checking for max_fail_percentage 43681 1727204696.91011: checking to see if all hosts have failed and the running result is not ok 43681 1727204696.91012: done checking to see if all hosts have failed 43681 1727204696.91013: getting the remaining hosts for this loop 43681 1727204696.91015: done getting the remaining hosts for this loop 43681 1727204696.91020: getting the next task for host managed-node3 43681 1727204696.91029: done getting next task for host managed-node3 43681 1727204696.91034: ^ task is: TASK: Include the task 'manage_test_interface.yml' 43681 1727204696.91036: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 43681 1727204696.91042: getting variables 43681 1727204696.91044: in VariableManager get_vars() 43681 1727204696.91087: Calling all_inventory to load vars for managed-node3 43681 1727204696.91195: Calling groups_inventory to load vars for managed-node3 43681 1727204696.91200: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204696.91212: Calling all_plugins_play to load vars for managed-node3 43681 1727204696.91216: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204696.91220: Calling groups_plugins_play to load vars for managed-node3 43681 1727204696.91586: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204696.91916: done with get_vars() 43681 1727204696.91930: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:16 Tuesday 24 September 2024 15:04:56 -0400 (0:00:00.051) 0:00:04.586 ***** 43681 1727204696.92034: entering _queue_task() for managed-node3/include_tasks 43681 1727204696.92320: worker is 1 (out of 1 available) 43681 1727204696.92335: exiting _queue_task() for managed-node3/include_tasks 43681 1727204696.92351: done queuing things up, now waiting for results queue to drain 43681 1727204696.92353: waiting for pending results... 
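The include queued here (tests_routing_rules.yml:16) pulls in the manage_test_interface.yml task file. A hedged sketch: the relative include path and the parameter value are assumptions; the log later shows 'state' arriving as an include param and being one of "present"/"absent", but not its exact value.

    - name: Include the task 'manage_test_interface.yml'
      ansible.builtin.include_tasks: tasks/manage_test_interface.yml   # assumed relative path
      vars:
        state: present    # placeholder; the log only shows that state is "present" or "absent"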
43681 1727204696.92633: running TaskExecutor() for managed-node3/TASK: Include the task 'manage_test_interface.yml' 43681 1727204696.92798: in run() - task 12b410aa-8751-9e86-7728-00000000000d 43681 1727204696.92802: variable 'ansible_search_path' from source: unknown 43681 1727204696.92827: calling self._execute() 43681 1727204696.92925: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204696.92995: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204696.93000: variable 'omit' from source: magic vars 43681 1727204696.93382: variable 'ansible_distribution_major_version' from source: facts 43681 1727204696.93402: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204696.93415: _execute() done 43681 1727204696.93423: dumping result to json 43681 1727204696.93431: done dumping result, returning 43681 1727204696.93442: done running TaskExecutor() for managed-node3/TASK: Include the task 'manage_test_interface.yml' [12b410aa-8751-9e86-7728-00000000000d] 43681 1727204696.93453: sending task result for task 12b410aa-8751-9e86-7728-00000000000d 43681 1727204696.93717: no more pending results, returning what we have 43681 1727204696.93722: in VariableManager get_vars() 43681 1727204696.93764: Calling all_inventory to load vars for managed-node3 43681 1727204696.93767: Calling groups_inventory to load vars for managed-node3 43681 1727204696.93770: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204696.93784: Calling all_plugins_play to load vars for managed-node3 43681 1727204696.93788: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204696.93794: Calling groups_plugins_play to load vars for managed-node3 43681 1727204696.94158: done sending task result for task 12b410aa-8751-9e86-7728-00000000000d 43681 1727204696.94162: WORKER PROCESS EXITING 43681 1727204696.94188: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204696.94510: done with get_vars() 43681 1727204696.94519: variable 'ansible_search_path' from source: unknown 43681 1727204696.94533: we have included files to process 43681 1727204696.94534: generating all_blocks data 43681 1727204696.94536: done generating all_blocks data 43681 1727204696.94541: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 43681 1727204696.94543: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 43681 1727204696.94545: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 43681 1727204696.95196: in VariableManager get_vars() 43681 1727204696.95219: done with get_vars() 43681 1727204696.95505: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 43681 1727204696.96218: done processing included file 43681 1727204696.96220: iterating over new_blocks loaded from include file 43681 1727204696.96222: in VariableManager get_vars() 43681 1727204696.96240: done with get_vars() 43681 1727204696.96241: filtering new block on tags 43681 1727204696.96281: done filtering new block on tags 43681 1727204696.96285: done iterating over new_blocks loaded from include file included: 
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed-node3 43681 1727204696.96293: extending task lists for all hosts with included blocks 43681 1727204696.98160: done extending task lists 43681 1727204696.98161: done processing included files 43681 1727204696.98163: results queue empty 43681 1727204696.98163: checking for any_errors_fatal 43681 1727204696.98167: done checking for any_errors_fatal 43681 1727204696.98168: checking for max_fail_percentage 43681 1727204696.98170: done checking for max_fail_percentage 43681 1727204696.98170: checking to see if all hosts have failed and the running result is not ok 43681 1727204696.98171: done checking to see if all hosts have failed 43681 1727204696.98172: getting the remaining hosts for this loop 43681 1727204696.98174: done getting the remaining hosts for this loop 43681 1727204696.98177: getting the next task for host managed-node3 43681 1727204696.98181: done getting next task for host managed-node3 43681 1727204696.98184: ^ task is: TASK: Ensure state in ["present", "absent"] 43681 1727204696.98187: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 43681 1727204696.98191: getting variables 43681 1727204696.98192: in VariableManager get_vars() 43681 1727204696.98228: Calling all_inventory to load vars for managed-node3 43681 1727204696.98231: Calling groups_inventory to load vars for managed-node3 43681 1727204696.98234: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204696.98240: Calling all_plugins_play to load vars for managed-node3 43681 1727204696.98244: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204696.98247: Calling groups_plugins_play to load vars for managed-node3 43681 1727204696.98496: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204696.98844: done with get_vars() 43681 1727204696.98855: done getting variables 43681 1727204696.98948: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Tuesday 24 September 2024 15:04:56 -0400 (0:00:00.069) 0:00:04.656 ***** 43681 1727204696.98979: entering _queue_task() for managed-node3/fail 43681 1727204696.98981: Creating lock for fail 43681 1727204696.99326: worker is 1 (out of 1 available) 43681 1727204696.99340: exiting _queue_task() for managed-node3/fail 43681 1727204696.99363: done queuing things up, now waiting for results queue to drain 43681 1727204696.99366: waiting for pending results... 
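The guard task queued here (manage_test_interface.yml:3) fails fast on an unexpected state value. A minimal sketch whose when clause is taken verbatim from the false_condition reported below; the fail message wording is assumed, as it is not shown in the log:

    - name: Ensure state in ["present", "absent"]
      ansible.builtin.fail:
        msg: "state must be 'present' or 'absent'"    # assumed wording; not shown in the log
      when: state not in ["present", "absent"]        # condition as reported by false_condition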
43681 1727204696.99558: running TaskExecutor() for managed-node3/TASK: Ensure state in ["present", "absent"] 43681 1727204696.99680: in run() - task 12b410aa-8751-9e86-7728-0000000001cc 43681 1727204696.99705: variable 'ansible_search_path' from source: unknown 43681 1727204696.99715: variable 'ansible_search_path' from source: unknown 43681 1727204696.99759: calling self._execute() 43681 1727204696.99846: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204696.99859: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204696.99876: variable 'omit' from source: magic vars 43681 1727204697.00285: variable 'ansible_distribution_major_version' from source: facts 43681 1727204697.00306: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204697.00483: variable 'state' from source: include params 43681 1727204697.00498: Evaluated conditional (state not in ["present", "absent"]): False 43681 1727204697.00506: when evaluation is False, skipping this task 43681 1727204697.00514: _execute() done 43681 1727204697.00521: dumping result to json 43681 1727204697.00536: done dumping result, returning 43681 1727204697.00568: done running TaskExecutor() for managed-node3/TASK: Ensure state in ["present", "absent"] [12b410aa-8751-9e86-7728-0000000001cc] 43681 1727204697.00581: sending task result for task 12b410aa-8751-9e86-7728-0000000001cc skipping: [managed-node3] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 43681 1727204697.00738: no more pending results, returning what we have 43681 1727204697.00743: results queue empty 43681 1727204697.00744: checking for any_errors_fatal 43681 1727204697.00747: done checking for any_errors_fatal 43681 1727204697.00749: checking for max_fail_percentage 43681 1727204697.00750: done checking for max_fail_percentage 43681 1727204697.00751: checking to see if all hosts have failed and the running result is not ok 43681 1727204697.00752: done checking to see if all hosts have failed 43681 1727204697.00753: getting the remaining hosts for this loop 43681 1727204697.00754: done getting the remaining hosts for this loop 43681 1727204697.00759: getting the next task for host managed-node3 43681 1727204697.00765: done getting next task for host managed-node3 43681 1727204697.00768: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 43681 1727204697.00772: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204697.00775: getting variables 43681 1727204697.00777: in VariableManager get_vars() 43681 1727204697.00819: Calling all_inventory to load vars for managed-node3 43681 1727204697.00822: Calling groups_inventory to load vars for managed-node3 43681 1727204697.00825: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204697.00839: Calling all_plugins_play to load vars for managed-node3 43681 1727204697.00843: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204697.00847: Calling groups_plugins_play to load vars for managed-node3 43681 1727204697.01140: done sending task result for task 12b410aa-8751-9e86-7728-0000000001cc 43681 1727204697.01143: WORKER PROCESS EXITING 43681 1727204697.01170: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204697.01499: done with get_vars() 43681 1727204697.01511: done getting variables 43681 1727204697.01573: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Tuesday 24 September 2024 15:04:57 -0400 (0:00:00.026) 0:00:04.682 ***** 43681 1727204697.01607: entering _queue_task() for managed-node3/fail 43681 1727204697.01840: worker is 1 (out of 1 available) 43681 1727204697.01853: exiting _queue_task() for managed-node3/fail 43681 1727204697.01866: done queuing things up, now waiting for results queue to drain 43681 1727204697.01868: waiting for pending results... 
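Likewise, the second guard (manage_test_interface.yml:8) validates the interface type. The when clause matches the false_condition reported below; the message is again an assumption:

    - name: Ensure type in ["dummy", "tap", "veth"]
      ansible.builtin.fail:
        msg: "type must be 'dummy', 'tap' or 'veth'"   # assumed wording; not shown in the log
      when: type not in ["dummy", "tap", "veth"]       # condition as reported by false_condition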
43681 1727204697.02120: running TaskExecutor() for managed-node3/TASK: Ensure type in ["dummy", "tap", "veth"] 43681 1727204697.02243: in run() - task 12b410aa-8751-9e86-7728-0000000001cd 43681 1727204697.02264: variable 'ansible_search_path' from source: unknown 43681 1727204697.02297: variable 'ansible_search_path' from source: unknown 43681 1727204697.02324: calling self._execute() 43681 1727204697.02409: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204697.02428: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204697.02495: variable 'omit' from source: magic vars 43681 1727204697.02856: variable 'ansible_distribution_major_version' from source: facts 43681 1727204697.02876: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204697.03063: variable 'type' from source: set_fact 43681 1727204697.03079: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 43681 1727204697.03087: when evaluation is False, skipping this task 43681 1727204697.03097: _execute() done 43681 1727204697.03105: dumping result to json 43681 1727204697.03112: done dumping result, returning 43681 1727204697.03294: done running TaskExecutor() for managed-node3/TASK: Ensure type in ["dummy", "tap", "veth"] [12b410aa-8751-9e86-7728-0000000001cd] 43681 1727204697.03298: sending task result for task 12b410aa-8751-9e86-7728-0000000001cd 43681 1727204697.03359: done sending task result for task 12b410aa-8751-9e86-7728-0000000001cd 43681 1727204697.03363: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 43681 1727204697.03402: no more pending results, returning what we have 43681 1727204697.03405: results queue empty 43681 1727204697.03406: checking for any_errors_fatal 43681 1727204697.03411: done checking for any_errors_fatal 43681 1727204697.03412: checking for max_fail_percentage 43681 1727204697.03414: done checking for max_fail_percentage 43681 1727204697.03414: checking to see if all hosts have failed and the running result is not ok 43681 1727204697.03415: done checking to see if all hosts have failed 43681 1727204697.03416: getting the remaining hosts for this loop 43681 1727204697.03418: done getting the remaining hosts for this loop 43681 1727204697.03421: getting the next task for host managed-node3 43681 1727204697.03427: done getting next task for host managed-node3 43681 1727204697.03430: ^ task is: TASK: Include the task 'show_interfaces.yml' 43681 1727204697.03433: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204697.03436: getting variables 43681 1727204697.03437: in VariableManager get_vars() 43681 1727204697.03469: Calling all_inventory to load vars for managed-node3 43681 1727204697.03472: Calling groups_inventory to load vars for managed-node3 43681 1727204697.03475: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204697.03486: Calling all_plugins_play to load vars for managed-node3 43681 1727204697.03492: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204697.03496: Calling groups_plugins_play to load vars for managed-node3 43681 1727204697.03779: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204697.04126: done with get_vars() 43681 1727204697.04137: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Tuesday 24 September 2024 15:04:57 -0400 (0:00:00.026) 0:00:04.708 ***** 43681 1727204697.04235: entering _queue_task() for managed-node3/include_tasks 43681 1727204697.04452: worker is 1 (out of 1 available) 43681 1727204697.04464: exiting _queue_task() for managed-node3/include_tasks 43681 1727204697.04476: done queuing things up, now waiting for results queue to drain 43681 1727204697.04478: waiting for pending results... 43681 1727204697.04722: running TaskExecutor() for managed-node3/TASK: Include the task 'show_interfaces.yml' 43681 1727204697.04839: in run() - task 12b410aa-8751-9e86-7728-0000000001ce 43681 1727204697.04894: variable 'ansible_search_path' from source: unknown 43681 1727204697.04899: variable 'ansible_search_path' from source: unknown 43681 1727204697.04910: calling self._execute() 43681 1727204697.04999: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204697.05013: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204697.05095: variable 'omit' from source: magic vars 43681 1727204697.05439: variable 'ansible_distribution_major_version' from source: facts 43681 1727204697.05461: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204697.05473: _execute() done 43681 1727204697.05482: dumping result to json 43681 1727204697.05492: done dumping result, returning 43681 1727204697.05505: done running TaskExecutor() for managed-node3/TASK: Include the task 'show_interfaces.yml' [12b410aa-8751-9e86-7728-0000000001ce] 43681 1727204697.05516: sending task result for task 12b410aa-8751-9e86-7728-0000000001ce 43681 1727204697.05815: no more pending results, returning what we have 43681 1727204697.05820: in VariableManager get_vars() 43681 1727204697.05854: Calling all_inventory to load vars for managed-node3 43681 1727204697.05857: Calling groups_inventory to load vars for managed-node3 43681 1727204697.05860: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204697.05870: Calling all_plugins_play to load vars for managed-node3 43681 1727204697.05873: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204697.05877: Calling groups_plugins_play to load vars for managed-node3 43681 1727204697.06098: done sending task result for task 12b410aa-8751-9e86-7728-0000000001ce 43681 1727204697.06102: WORKER PROCESS EXITING 43681 1727204697.06127: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 43681 1727204697.06401: done with get_vars() 43681 1727204697.06409: variable 'ansible_search_path' from source: unknown 43681 1727204697.06410: variable 'ansible_search_path' from source: unknown 43681 1727204697.06446: we have included files to process 43681 1727204697.06447: generating all_blocks data 43681 1727204697.06449: done generating all_blocks data 43681 1727204697.06455: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 43681 1727204697.06456: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 43681 1727204697.06459: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 43681 1727204697.06582: in VariableManager get_vars() 43681 1727204697.06612: done with get_vars() 43681 1727204697.06746: done processing included file 43681 1727204697.06749: iterating over new_blocks loaded from include file 43681 1727204697.06751: in VariableManager get_vars() 43681 1727204697.06771: done with get_vars() 43681 1727204697.06773: filtering new block on tags 43681 1727204697.06798: done filtering new block on tags 43681 1727204697.06801: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node3 43681 1727204697.06807: extending task lists for all hosts with included blocks 43681 1727204697.07370: done extending task lists 43681 1727204697.07372: done processing included files 43681 1727204697.07373: results queue empty 43681 1727204697.07374: checking for any_errors_fatal 43681 1727204697.07378: done checking for any_errors_fatal 43681 1727204697.07379: checking for max_fail_percentage 43681 1727204697.07381: done checking for max_fail_percentage 43681 1727204697.07382: checking to see if all hosts have failed and the running result is not ok 43681 1727204697.07383: done checking to see if all hosts have failed 43681 1727204697.07384: getting the remaining hosts for this loop 43681 1727204697.07385: done getting the remaining hosts for this loop 43681 1727204697.07388: getting the next task for host managed-node3 43681 1727204697.07395: done getting next task for host managed-node3 43681 1727204697.07398: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 43681 1727204697.07401: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204697.07404: getting variables 43681 1727204697.07405: in VariableManager get_vars() 43681 1727204697.07448: Calling all_inventory to load vars for managed-node3 43681 1727204697.07451: Calling groups_inventory to load vars for managed-node3 43681 1727204697.07454: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204697.07460: Calling all_plugins_play to load vars for managed-node3 43681 1727204697.07463: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204697.07467: Calling groups_plugins_play to load vars for managed-node3 43681 1727204697.07675: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204697.07983: done with get_vars() 43681 1727204697.07996: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 15:04:57 -0400 (0:00:00.038) 0:00:04.747 ***** 43681 1727204697.08081: entering _queue_task() for managed-node3/include_tasks 43681 1727204697.08343: worker is 1 (out of 1 available) 43681 1727204697.08358: exiting _queue_task() for managed-node3/include_tasks 43681 1727204697.08372: done queuing things up, now waiting for results queue to drain 43681 1727204697.08374: waiting for pending results... 43681 1727204697.08648: running TaskExecutor() for managed-node3/TASK: Include the task 'get_current_interfaces.yml' 43681 1727204697.08781: in run() - task 12b410aa-8751-9e86-7728-000000000275 43681 1727204697.08806: variable 'ansible_search_path' from source: unknown 43681 1727204697.08821: variable 'ansible_search_path' from source: unknown 43681 1727204697.08865: calling self._execute() 43681 1727204697.08958: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204697.08973: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204697.08994: variable 'omit' from source: magic vars 43681 1727204697.09416: variable 'ansible_distribution_major_version' from source: facts 43681 1727204697.09435: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204697.09448: _execute() done 43681 1727204697.09459: dumping result to json 43681 1727204697.09467: done dumping result, returning 43681 1727204697.09478: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_current_interfaces.yml' [12b410aa-8751-9e86-7728-000000000275] 43681 1727204697.09488: sending task result for task 12b410aa-8751-9e86-7728-000000000275 43681 1727204697.09611: done sending task result for task 12b410aa-8751-9e86-7728-000000000275 43681 1727204697.09644: no more pending results, returning what we have 43681 1727204697.09650: in VariableManager get_vars() 43681 1727204697.09697: Calling all_inventory to load vars for managed-node3 43681 1727204697.09701: Calling groups_inventory to load vars for managed-node3 43681 1727204697.09704: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204697.09719: Calling all_plugins_play to load vars for managed-node3 43681 1727204697.09723: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204697.09728: Calling groups_plugins_play to load vars for managed-node3 43681 1727204697.10182: WORKER PROCESS EXITING 43681 1727204697.10213: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' 
skipped due to reserved name 43681 1727204697.10547: done with get_vars() 43681 1727204697.10556: variable 'ansible_search_path' from source: unknown 43681 1727204697.10558: variable 'ansible_search_path' from source: unknown 43681 1727204697.10632: we have included files to process 43681 1727204697.10633: generating all_blocks data 43681 1727204697.10635: done generating all_blocks data 43681 1727204697.10636: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 43681 1727204697.10638: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 43681 1727204697.10640: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 43681 1727204697.10964: done processing included file 43681 1727204697.10966: iterating over new_blocks loaded from include file 43681 1727204697.10968: in VariableManager get_vars() 43681 1727204697.10992: done with get_vars() 43681 1727204697.10995: filtering new block on tags 43681 1727204697.11018: done filtering new block on tags 43681 1727204697.11021: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed-node3 43681 1727204697.11028: extending task lists for all hosts with included blocks 43681 1727204697.11234: done extending task lists 43681 1727204697.11235: done processing included files 43681 1727204697.11236: results queue empty 43681 1727204697.11237: checking for any_errors_fatal 43681 1727204697.11240: done checking for any_errors_fatal 43681 1727204697.11241: checking for max_fail_percentage 43681 1727204697.11243: done checking for max_fail_percentage 43681 1727204697.11244: checking to see if all hosts have failed and the running result is not ok 43681 1727204697.11245: done checking to see if all hosts have failed 43681 1727204697.11245: getting the remaining hosts for this loop 43681 1727204697.11247: done getting the remaining hosts for this loop 43681 1727204697.11250: getting the next task for host managed-node3 43681 1727204697.11255: done getting next task for host managed-node3 43681 1727204697.11257: ^ task is: TASK: Gather current interface info 43681 1727204697.11262: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 43681 1727204697.11264: getting variables 43681 1727204697.11266: in VariableManager get_vars() 43681 1727204697.11278: Calling all_inventory to load vars for managed-node3 43681 1727204697.11280: Calling groups_inventory to load vars for managed-node3 43681 1727204697.11283: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204697.11288: Calling all_plugins_play to load vars for managed-node3 43681 1727204697.11293: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204697.11297: Calling groups_plugins_play to load vars for managed-node3 43681 1727204697.11513: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204697.11845: done with get_vars() 43681 1727204697.11856: done getting variables 43681 1727204697.11905: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 15:04:57 -0400 (0:00:00.038) 0:00:04.786 ***** 43681 1727204697.11937: entering _queue_task() for managed-node3/command 43681 1727204697.12207: worker is 1 (out of 1 available) 43681 1727204697.12220: exiting _queue_task() for managed-node3/command 43681 1727204697.12235: done queuing things up, now waiting for results queue to drain 43681 1727204697.12237: waiting for pending results... 
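For reference, the include chain at this point is manage_test_interface.yml -> show_interfaces.yml -> get_current_interfaces.yml, and the first concrete task in that chain is the command task being queued above. The task text itself is not printed in this log, but from the module arguments (chdir=/sys/class/net, _raw_params='ls -1') and the registered variable name that appear further down, a plausible reconstruction of get_current_interfaces.yml:3 is the following sketch (not taken verbatim from the playbook):

  # Reconstructed sketch, based only on the module arguments logged below
  - name: Gather current interface info
    command: ls -1
    args:
      chdir: /sys/class/net
    register: _current_interfaces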
43681 1727204697.12506: running TaskExecutor() for managed-node3/TASK: Gather current interface info 43681 1727204697.12655: in run() - task 12b410aa-8751-9e86-7728-0000000002ac 43681 1727204697.12678: variable 'ansible_search_path' from source: unknown 43681 1727204697.12687: variable 'ansible_search_path' from source: unknown 43681 1727204697.12739: calling self._execute() 43681 1727204697.12833: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204697.12852: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204697.12868: variable 'omit' from source: magic vars 43681 1727204697.13358: variable 'ansible_distribution_major_version' from source: facts 43681 1727204697.13377: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204697.13396: variable 'omit' from source: magic vars 43681 1727204697.13471: variable 'omit' from source: magic vars 43681 1727204697.13694: variable 'omit' from source: magic vars 43681 1727204697.13698: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204697.13701: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204697.13703: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204697.13705: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204697.13708: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204697.13725: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204697.13734: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204697.13744: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204697.13876: Set connection var ansible_shell_type to sh 43681 1727204697.13896: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204697.13910: Set connection var ansible_timeout to 10 43681 1727204697.13928: Set connection var ansible_pipelining to False 43681 1727204697.13945: Set connection var ansible_connection to ssh 43681 1727204697.13959: Set connection var ansible_shell_executable to /bin/sh 43681 1727204697.13988: variable 'ansible_shell_executable' from source: unknown 43681 1727204697.14000: variable 'ansible_connection' from source: unknown 43681 1727204697.14009: variable 'ansible_module_compression' from source: unknown 43681 1727204697.14018: variable 'ansible_shell_type' from source: unknown 43681 1727204697.14027: variable 'ansible_shell_executable' from source: unknown 43681 1727204697.14035: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204697.14049: variable 'ansible_pipelining' from source: unknown 43681 1727204697.14057: variable 'ansible_timeout' from source: unknown 43681 1727204697.14067: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204697.14239: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 43681 1727204697.14261: variable 'omit' from source: magic vars 43681 
1727204697.14371: starting attempt loop 43681 1727204697.14374: running the handler 43681 1727204697.14377: _low_level_execute_command(): starting 43681 1727204697.14379: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 43681 1727204697.15074: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204697.15094: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204697.15111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204697.15139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204697.15253: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 43681 1727204697.15269: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204697.15294: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204697.15368: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204697.17142: stdout chunk (state=3): >>>/root <<< 43681 1727204697.17332: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204697.17345: stdout chunk (state=3): >>><<< 43681 1727204697.17361: stderr chunk (state=3): >>><<< 43681 1727204697.17396: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204697.17419: _low_level_execute_command(): starting 43681 1727204697.17432: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204697.174036-43969-191626749824301 `" && echo 
ansible-tmp-1727204697.174036-43969-191626749824301="` echo /root/.ansible/tmp/ansible-tmp-1727204697.174036-43969-191626749824301 `" ) && sleep 0' 43681 1727204697.18106: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204697.18125: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204697.18140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204697.18170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204697.18275: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204697.18315: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204697.18341: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204697.18357: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204697.18433: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204697.20431: stdout chunk (state=3): >>>ansible-tmp-1727204697.174036-43969-191626749824301=/root/.ansible/tmp/ansible-tmp-1727204697.174036-43969-191626749824301 <<< 43681 1727204697.20608: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204697.20628: stdout chunk (state=3): >>><<< 43681 1727204697.20645: stderr chunk (state=3): >>><<< 43681 1727204697.20795: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204697.174036-43969-191626749824301=/root/.ansible/tmp/ansible-tmp-1727204697.174036-43969-191626749824301 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204697.20798: variable 'ansible_module_compression' from source: unknown 43681 
1727204697.20801: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-436817kzep24q/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 43681 1727204697.20803: variable 'ansible_facts' from source: unknown 43681 1727204697.20909: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204697.174036-43969-191626749824301/AnsiballZ_command.py 43681 1727204697.21167: Sending initial data 43681 1727204697.21171: Sent initial data (155 bytes) 43681 1727204697.21805: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204697.21834: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204697.21911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204697.21972: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204697.22001: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204697.22082: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204697.23678: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 43681 1727204697.23711: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 43681 1727204697.23745: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-436817kzep24q/tmpw68decyl /root/.ansible/tmp/ansible-tmp-1727204697.174036-43969-191626749824301/AnsiballZ_command.py <<< 43681 1727204697.23750: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204697.174036-43969-191626749824301/AnsiballZ_command.py" <<< 43681 1727204697.23779: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-436817kzep24q/tmpw68decyl" to remote "/root/.ansible/tmp/ansible-tmp-1727204697.174036-43969-191626749824301/AnsiballZ_command.py" <<< 43681 1727204697.23782: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204697.174036-43969-191626749824301/AnsiballZ_command.py" <<< 43681 1727204697.24554: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204697.24635: stderr chunk (state=3): >>><<< 43681 1727204697.24639: stdout chunk (state=3): >>><<< 43681 1727204697.24662: done transferring module to remote 43681 1727204697.24675: _low_level_execute_command(): starting 43681 1727204697.24680: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204697.174036-43969-191626749824301/ /root/.ansible/tmp/ansible-tmp-1727204697.174036-43969-191626749824301/AnsiballZ_command.py && sleep 0' 43681 1727204697.25153: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204697.25202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 43681 1727204697.25206: stderr chunk (state=3): >>>debug2: match not found <<< 43681 1727204697.25210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 43681 1727204697.25213: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204697.25215: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204697.25273: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204697.25323: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204697.27145: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204697.27223: stderr chunk (state=3): >>><<< 43681 1727204697.27226: stdout chunk (state=3): >>><<< 43681 1727204697.27272: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204697.27276: _low_level_execute_command(): starting 43681 1727204697.27279: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204697.174036-43969-191626749824301/AnsiballZ_command.py && sleep 0' 43681 1727204697.27759: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204697.27763: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204697.27766: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204697.27770: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204697.27825: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204697.27834: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204697.27874: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204697.45358: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:04:57.448814", "end": "2024-09-24 15:04:57.452414", "delta": "0:00:00.003600", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 43681 1727204697.47016: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
<<< 43681 1727204697.47086: stderr chunk (state=3): >>><<< 43681 1727204697.47093: stdout chunk (state=3): >>><<< 43681 1727204697.47111: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:04:57.448814", "end": "2024-09-24 15:04:57.452414", "delta": "0:00:00.003600", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
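The module result above came back with rc=0 and stdout "bonding_masters\neth0\nlo", i.e. the contents of /sys/class/net on the managed host. If one wanted to inspect the registered result directly, an illustrative debug task could look like the sketch below (the variable name _current_interfaces comes from this log; stdout_lines is the standard command-module return field that splits stdout on newlines):

  # Illustrative only; not part of the test playbook
  - name: Inspect the registered command result
    debug:
      var: _current_interfaces.stdout_lines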
43681 1727204697.47150: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204697.174036-43969-191626749824301/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 43681 1727204697.47159: _low_level_execute_command(): starting 43681 1727204697.47165: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204697.174036-43969-191626749824301/ > /dev/null 2>&1 && sleep 0' 43681 1727204697.47663: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204697.47667: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204697.47670: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204697.47723: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204697.47726: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204697.47772: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204697.49655: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204697.49711: stderr chunk (state=3): >>><<< 43681 1727204697.49715: stdout chunk (state=3): >>><<< 43681 1727204697.49731: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204697.49739: handler run complete 43681 1727204697.49762: Evaluated conditional (False): False 43681 1727204697.49773: attempt loop complete, returning result 43681 1727204697.49776: _execute() done 43681 1727204697.49780: dumping result to json 43681 1727204697.49786: done dumping result, returning 43681 1727204697.49798: done running TaskExecutor() for managed-node3/TASK: Gather current interface info [12b410aa-8751-9e86-7728-0000000002ac] 43681 1727204697.49807: sending task result for task 12b410aa-8751-9e86-7728-0000000002ac 43681 1727204697.49922: done sending task result for task 12b410aa-8751-9e86-7728-0000000002ac 43681 1727204697.49925: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003600", "end": "2024-09-24 15:04:57.452414", "rc": 0, "start": "2024-09-24 15:04:57.448814" } STDOUT: bonding_masters eth0 lo 43681 1727204697.50021: no more pending results, returning what we have 43681 1727204697.50025: results queue empty 43681 1727204697.50027: checking for any_errors_fatal 43681 1727204697.50028: done checking for any_errors_fatal 43681 1727204697.50029: checking for max_fail_percentage 43681 1727204697.50031: done checking for max_fail_percentage 43681 1727204697.50031: checking to see if all hosts have failed and the running result is not ok 43681 1727204697.50032: done checking to see if all hosts have failed 43681 1727204697.50033: getting the remaining hosts for this loop 43681 1727204697.50036: done getting the remaining hosts for this loop 43681 1727204697.50041: getting the next task for host managed-node3 43681 1727204697.50049: done getting next task for host managed-node3 43681 1727204697.50053: ^ task is: TASK: Set current_interfaces 43681 1727204697.50058: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204697.50062: getting variables 43681 1727204697.50137: in VariableManager get_vars() 43681 1727204697.50164: Calling all_inventory to load vars for managed-node3 43681 1727204697.50166: Calling groups_inventory to load vars for managed-node3 43681 1727204697.50168: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204697.50177: Calling all_plugins_play to load vars for managed-node3 43681 1727204697.50179: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204697.50181: Calling groups_plugins_play to load vars for managed-node3 43681 1727204697.50335: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204697.50518: done with get_vars() 43681 1727204697.50527: done getting variables 43681 1727204697.50578: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 15:04:57 -0400 (0:00:00.386) 0:00:05.172 ***** 43681 1727204697.50606: entering _queue_task() for managed-node3/set_fact 43681 1727204697.50827: worker is 1 (out of 1 available) 43681 1727204697.50841: exiting _queue_task() for managed-node3/set_fact 43681 1727204697.50854: done queuing things up, now waiting for results queue to drain 43681 1727204697.50856: waiting for pending results... 
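The Set current_interfaces task at get_current_interfaces.yml:9 converts the registered command output into the current_interfaces fact, which ends up below as ['bonding_masters', 'eth0', 'lo']. The exact expression is not shown in this log, but a plausible reconstruction is:

  # Reconstructed sketch of get_current_interfaces.yml:9
  - name: Set current_interfaces
    set_fact:
      current_interfaces: "{{ _current_interfaces.stdout_lines }}"

The subsequent Show current_interfaces task then simply prints this fact (its output further below reads "current_interfaces: ['bonding_masters', 'eth0', 'lo']"), consistent with a debug task that templates the current_interfaces variable into a message.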
43681 1727204697.51020: running TaskExecutor() for managed-node3/TASK: Set current_interfaces 43681 1727204697.51104: in run() - task 12b410aa-8751-9e86-7728-0000000002ad 43681 1727204697.51119: variable 'ansible_search_path' from source: unknown 43681 1727204697.51124: variable 'ansible_search_path' from source: unknown 43681 1727204697.51152: calling self._execute() 43681 1727204697.51224: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204697.51232: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204697.51241: variable 'omit' from source: magic vars 43681 1727204697.51550: variable 'ansible_distribution_major_version' from source: facts 43681 1727204697.51562: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204697.51568: variable 'omit' from source: magic vars 43681 1727204697.51613: variable 'omit' from source: magic vars 43681 1727204697.51707: variable '_current_interfaces' from source: set_fact 43681 1727204697.51762: variable 'omit' from source: magic vars 43681 1727204697.51796: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204697.51829: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204697.51848: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204697.51867: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204697.51877: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204697.51907: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204697.51910: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204697.51915: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204697.52002: Set connection var ansible_shell_type to sh 43681 1727204697.52009: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204697.52018: Set connection var ansible_timeout to 10 43681 1727204697.52027: Set connection var ansible_pipelining to False 43681 1727204697.52033: Set connection var ansible_connection to ssh 43681 1727204697.52039: Set connection var ansible_shell_executable to /bin/sh 43681 1727204697.52061: variable 'ansible_shell_executable' from source: unknown 43681 1727204697.52064: variable 'ansible_connection' from source: unknown 43681 1727204697.52075: variable 'ansible_module_compression' from source: unknown 43681 1727204697.52078: variable 'ansible_shell_type' from source: unknown 43681 1727204697.52080: variable 'ansible_shell_executable' from source: unknown 43681 1727204697.52082: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204697.52085: variable 'ansible_pipelining' from source: unknown 43681 1727204697.52087: variable 'ansible_timeout' from source: unknown 43681 1727204697.52094: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204697.52213: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 43681 1727204697.52223: variable 'omit' from source: magic vars 43681 1727204697.52230: starting attempt loop 43681 1727204697.52233: running the handler 43681 1727204697.52244: handler run complete 43681 1727204697.52254: attempt loop complete, returning result 43681 1727204697.52257: _execute() done 43681 1727204697.52261: dumping result to json 43681 1727204697.52266: done dumping result, returning 43681 1727204697.52275: done running TaskExecutor() for managed-node3/TASK: Set current_interfaces [12b410aa-8751-9e86-7728-0000000002ad] 43681 1727204697.52280: sending task result for task 12b410aa-8751-9e86-7728-0000000002ad 43681 1727204697.52370: done sending task result for task 12b410aa-8751-9e86-7728-0000000002ad 43681 1727204697.52373: WORKER PROCESS EXITING ok: [managed-node3] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 43681 1727204697.52444: no more pending results, returning what we have 43681 1727204697.52447: results queue empty 43681 1727204697.52448: checking for any_errors_fatal 43681 1727204697.52454: done checking for any_errors_fatal 43681 1727204697.52455: checking for max_fail_percentage 43681 1727204697.52457: done checking for max_fail_percentage 43681 1727204697.52458: checking to see if all hosts have failed and the running result is not ok 43681 1727204697.52459: done checking to see if all hosts have failed 43681 1727204697.52460: getting the remaining hosts for this loop 43681 1727204697.52461: done getting the remaining hosts for this loop 43681 1727204697.52465: getting the next task for host managed-node3 43681 1727204697.52472: done getting next task for host managed-node3 43681 1727204697.52475: ^ task is: TASK: Show current_interfaces 43681 1727204697.52479: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204697.52483: getting variables 43681 1727204697.52484: in VariableManager get_vars() 43681 1727204697.52521: Calling all_inventory to load vars for managed-node3 43681 1727204697.52524: Calling groups_inventory to load vars for managed-node3 43681 1727204697.52527: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204697.52536: Calling all_plugins_play to load vars for managed-node3 43681 1727204697.52538: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204697.52540: Calling groups_plugins_play to load vars for managed-node3 43681 1727204697.52686: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204697.52894: done with get_vars() 43681 1727204697.52902: done getting variables 43681 1727204697.52948: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 15:04:57 -0400 (0:00:00.023) 0:00:05.196 ***** 43681 1727204697.52974: entering _queue_task() for managed-node3/debug 43681 1727204697.53171: worker is 1 (out of 1 available) 43681 1727204697.53186: exiting _queue_task() for managed-node3/debug 43681 1727204697.53200: done queuing things up, now waiting for results queue to drain 43681 1727204697.53203: waiting for pending results... 43681 1727204697.53352: running TaskExecutor() for managed-node3/TASK: Show current_interfaces 43681 1727204697.53428: in run() - task 12b410aa-8751-9e86-7728-000000000276 43681 1727204697.53444: variable 'ansible_search_path' from source: unknown 43681 1727204697.53448: variable 'ansible_search_path' from source: unknown 43681 1727204697.53475: calling self._execute() 43681 1727204697.53548: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204697.53554: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204697.53563: variable 'omit' from source: magic vars 43681 1727204697.53855: variable 'ansible_distribution_major_version' from source: facts 43681 1727204697.53867: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204697.53876: variable 'omit' from source: magic vars 43681 1727204697.53918: variable 'omit' from source: magic vars 43681 1727204697.53998: variable 'current_interfaces' from source: set_fact 43681 1727204697.54022: variable 'omit' from source: magic vars 43681 1727204697.54053: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204697.54083: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204697.54105: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204697.54122: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204697.54131: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204697.54157: 
variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204697.54160: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204697.54166: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204697.54249: Set connection var ansible_shell_type to sh 43681 1727204697.54256: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204697.54262: Set connection var ansible_timeout to 10 43681 1727204697.54271: Set connection var ansible_pipelining to False 43681 1727204697.54277: Set connection var ansible_connection to ssh 43681 1727204697.54284: Set connection var ansible_shell_executable to /bin/sh 43681 1727204697.54307: variable 'ansible_shell_executable' from source: unknown 43681 1727204697.54312: variable 'ansible_connection' from source: unknown 43681 1727204697.54315: variable 'ansible_module_compression' from source: unknown 43681 1727204697.54320: variable 'ansible_shell_type' from source: unknown 43681 1727204697.54323: variable 'ansible_shell_executable' from source: unknown 43681 1727204697.54325: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204697.54327: variable 'ansible_pipelining' from source: unknown 43681 1727204697.54330: variable 'ansible_timeout' from source: unknown 43681 1727204697.54338: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204697.54452: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 43681 1727204697.54462: variable 'omit' from source: magic vars 43681 1727204697.54468: starting attempt loop 43681 1727204697.54470: running the handler 43681 1727204697.54511: handler run complete 43681 1727204697.54528: attempt loop complete, returning result 43681 1727204697.54531: _execute() done 43681 1727204697.54534: dumping result to json 43681 1727204697.54539: done dumping result, returning 43681 1727204697.54546: done running TaskExecutor() for managed-node3/TASK: Show current_interfaces [12b410aa-8751-9e86-7728-000000000276] 43681 1727204697.54556: sending task result for task 12b410aa-8751-9e86-7728-000000000276 43681 1727204697.54644: done sending task result for task 12b410aa-8751-9e86-7728-000000000276 43681 1727204697.54647: WORKER PROCESS EXITING ok: [managed-node3] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 43681 1727204697.54708: no more pending results, returning what we have 43681 1727204697.54711: results queue empty 43681 1727204697.54713: checking for any_errors_fatal 43681 1727204697.54720: done checking for any_errors_fatal 43681 1727204697.54721: checking for max_fail_percentage 43681 1727204697.54722: done checking for max_fail_percentage 43681 1727204697.54723: checking to see if all hosts have failed and the running result is not ok 43681 1727204697.54724: done checking to see if all hosts have failed 43681 1727204697.54725: getting the remaining hosts for this loop 43681 1727204697.54726: done getting the remaining hosts for this loop 43681 1727204697.54730: getting the next task for host managed-node3 43681 1727204697.54737: done getting next task for host managed-node3 43681 1727204697.54739: ^ task is: TASK: Install iproute 43681 1727204697.54742: ^ state is: HOST STATE: block=2, task=5, rescue=0, 
always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 43681 1727204697.54747: getting variables 43681 1727204697.54749: in VariableManager get_vars() 43681 1727204697.54782: Calling all_inventory to load vars for managed-node3 43681 1727204697.54784: Calling groups_inventory to load vars for managed-node3 43681 1727204697.54786: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204697.54794: Calling all_plugins_play to load vars for managed-node3 43681 1727204697.54797: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204697.54799: Calling groups_plugins_play to load vars for managed-node3 43681 1727204697.54945: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204697.55127: done with get_vars() 43681 1727204697.55135: done getting variables 43681 1727204697.55178: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Tuesday 24 September 2024 15:04:57 -0400 (0:00:00.022) 0:00:05.218 ***** 43681 1727204697.55203: entering _queue_task() for managed-node3/package 43681 1727204697.55394: worker is 1 (out of 1 available) 43681 1727204697.55408: exiting _queue_task() for managed-node3/package 43681 1727204697.55423: done queuing things up, now waiting for results queue to drain 43681 1727204697.55425: waiting for pending results... 
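The Install iproute task at manage_test_interface.yml:16 runs the generic package action; the variable evaluation below also touches __network_is_ostree, which suggests the real task carries extra handling for ostree-based systems. A minimal sketch, assuming a plain package install and ignoring any ostree-specific logic:

  # Minimal sketch; the actual task likely includes additional ostree handling
  - name: Install iproute
    package:
      name: iproute
      state: present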
43681 1727204697.55572: running TaskExecutor() for managed-node3/TASK: Install iproute 43681 1727204697.55643: in run() - task 12b410aa-8751-9e86-7728-0000000001cf 43681 1727204697.55662: variable 'ansible_search_path' from source: unknown 43681 1727204697.55665: variable 'ansible_search_path' from source: unknown 43681 1727204697.55695: calling self._execute() 43681 1727204697.55757: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204697.55762: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204697.55776: variable 'omit' from source: magic vars 43681 1727204697.56065: variable 'ansible_distribution_major_version' from source: facts 43681 1727204697.56075: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204697.56083: variable 'omit' from source: magic vars 43681 1727204697.56120: variable 'omit' from source: magic vars 43681 1727204697.56273: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 43681 1727204697.58129: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 43681 1727204697.58183: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 43681 1727204697.58219: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 43681 1727204697.58247: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 43681 1727204697.58270: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 43681 1727204697.58353: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204697.58377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204697.58405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204697.58438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204697.58451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204697.58536: variable '__network_is_ostree' from source: set_fact 43681 1727204697.58540: variable 'omit' from source: magic vars 43681 1727204697.58564: variable 'omit' from source: magic vars 43681 1727204697.58587: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204697.58615: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204697.58633: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204697.58648: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 43681 1727204697.58659: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204697.58684: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204697.58688: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204697.58694: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204697.58776: Set connection var ansible_shell_type to sh 43681 1727204697.58783: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204697.58791: Set connection var ansible_timeout to 10 43681 1727204697.58800: Set connection var ansible_pipelining to False 43681 1727204697.58807: Set connection var ansible_connection to ssh 43681 1727204697.58813: Set connection var ansible_shell_executable to /bin/sh 43681 1727204697.58834: variable 'ansible_shell_executable' from source: unknown 43681 1727204697.58838: variable 'ansible_connection' from source: unknown 43681 1727204697.58841: variable 'ansible_module_compression' from source: unknown 43681 1727204697.58843: variable 'ansible_shell_type' from source: unknown 43681 1727204697.58846: variable 'ansible_shell_executable' from source: unknown 43681 1727204697.58855: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204697.58857: variable 'ansible_pipelining' from source: unknown 43681 1727204697.58860: variable 'ansible_timeout' from source: unknown 43681 1727204697.58865: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204697.58950: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 43681 1727204697.58959: variable 'omit' from source: magic vars 43681 1727204697.58971: starting attempt loop 43681 1727204697.58975: running the handler 43681 1727204697.58981: variable 'ansible_facts' from source: unknown 43681 1727204697.58984: variable 'ansible_facts' from source: unknown 43681 1727204697.59017: _low_level_execute_command(): starting 43681 1727204697.59026: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 43681 1727204697.59561: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204697.59565: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204697.59568: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204697.59570: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204697.59633: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204697.59638: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204697.59681: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204697.61410: stdout chunk (state=3): >>>/root <<< 43681 1727204697.61521: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204697.61575: stderr chunk (state=3): >>><<< 43681 1727204697.61579: stdout chunk (state=3): >>><<< 43681 1727204697.61603: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204697.61614: _low_level_execute_command(): starting 43681 1727204697.61620: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204697.616014-43985-23304879422258 `" && echo ansible-tmp-1727204697.616014-43985-23304879422258="` echo /root/.ansible/tmp/ansible-tmp-1727204697.616014-43985-23304879422258 `" ) && sleep 0' 43681 1727204697.62073: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204697.62077: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 43681 1727204697.62079: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 43681 1727204697.62082: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204697.62084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204697.62144: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204697.62151: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204697.62182: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204697.64137: stdout chunk (state=3): >>>ansible-tmp-1727204697.616014-43985-23304879422258=/root/.ansible/tmp/ansible-tmp-1727204697.616014-43985-23304879422258 <<< 43681 1727204697.64255: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204697.64302: stderr chunk (state=3): >>><<< 43681 1727204697.64305: stdout chunk (state=3): >>><<< 43681 1727204697.64323: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204697.616014-43985-23304879422258=/root/.ansible/tmp/ansible-tmp-1727204697.616014-43985-23304879422258 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204697.64349: variable 'ansible_module_compression' from source: unknown 43681 1727204697.64398: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 43681 1727204697.64404: ANSIBALLZ: Acquiring lock 43681 1727204697.64407: ANSIBALLZ: Lock acquired: 140156138759584 43681 1727204697.64409: ANSIBALLZ: Creating module 43681 1727204697.79309: ANSIBALLZ: Writing module into payload 43681 1727204697.79388: ANSIBALLZ: Writing module 43681 1727204697.79419: ANSIBALLZ: Renaming module 43681 1727204697.79432: ANSIBALLZ: Done creating module 43681 1727204697.79457: variable 'ansible_facts' from source: unknown 43681 1727204697.79567: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204697.616014-43985-23304879422258/AnsiballZ_dnf.py 43681 1727204697.79837: Sending initial data 43681 1727204697.79848: Sent initial data (150 bytes) 43681 1727204697.80475: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204697.80506: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204697.80578: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204697.82311: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 43681 1727204697.82367: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 43681 1727204697.82406: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-436817kzep24q/tmpjf5yxi38 /root/.ansible/tmp/ansible-tmp-1727204697.616014-43985-23304879422258/AnsiballZ_dnf.py <<< 43681 1727204697.82423: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204697.616014-43985-23304879422258/AnsiballZ_dnf.py" <<< 43681 1727204697.82450: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-436817kzep24q/tmpjf5yxi38" to remote "/root/.ansible/tmp/ansible-tmp-1727204697.616014-43985-23304879422258/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204697.616014-43985-23304879422258/AnsiballZ_dnf.py" <<< 43681 1727204697.84159: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204697.84200: stderr chunk (state=3): >>><<< 43681 1727204697.84210: stdout chunk (state=3): >>><<< 43681 1727204697.84244: done transferring module to remote 43681 1727204697.84264: _low_level_execute_command(): starting 43681 1727204697.84280: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204697.616014-43985-23304879422258/ /root/.ansible/tmp/ansible-tmp-1727204697.616014-43985-23304879422258/AnsiballZ_dnf.py && sleep 0' 43681 1727204697.84984: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204697.85010: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204697.85029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204697.85064: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204697.85171: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204697.85193: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204697.85428: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204697.87303: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204697.87373: stderr chunk (state=3): >>><<< 43681 1727204697.87388: stdout chunk (state=3): >>><<< 43681 1727204697.87411: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204697.87420: _low_level_execute_command(): starting 43681 1727204697.87430: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204697.616014-43985-23304879422258/AnsiballZ_dnf.py && sleep 0' 43681 1727204697.88062: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204697.88076: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204697.88092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204697.88113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204697.88132: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 43681 1727204697.88209: stderr chunk (state=3): >>>debug2: match not found <<< 43681 1727204697.88213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204697.88266: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204697.88283: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204697.88307: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204697.88403: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204699.34174: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 43681 1727204699.38918: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 43681 1727204699.38985: stderr chunk (state=3): >>><<< 43681 1727204699.38992: stdout chunk (state=3): >>><<< 43681 1727204699.39009: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 43681 1727204699.39058: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204697.616014-43985-23304879422258/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 43681 1727204699.39068: _low_level_execute_command(): starting 43681 1727204699.39075: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204697.616014-43985-23304879422258/ > /dev/null 2>&1 && sleep 0' 43681 1727204699.39570: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204699.39574: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204699.39578: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204699.39581: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 43681 1727204699.39583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204699.39648: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204699.39651: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204699.39653: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204699.39683: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204699.41586: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204699.41639: stderr chunk (state=3): >>><<< 43681 1727204699.41642: stdout chunk (state=3): >>><<< 43681 1727204699.41657: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204699.41666: handler run complete 43681 1727204699.41816: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 43681 1727204699.41975: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 43681 1727204699.42012: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 43681 1727204699.42043: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 43681 1727204699.42071: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 43681 1727204699.42134: variable '__install_status' from source: unknown 43681 1727204699.42152: Evaluated conditional (__install_status is success): True 43681 1727204699.42173: attempt loop complete, returning result 43681 1727204699.42176: _execute() done 43681 1727204699.42179: dumping result to json 43681 1727204699.42181: done dumping result, returning 43681 1727204699.42192: done running TaskExecutor() for managed-node3/TASK: Install iproute [12b410aa-8751-9e86-7728-0000000001cf] 43681 1727204699.42197: sending task result for task 12b410aa-8751-9e86-7728-0000000001cf 43681 1727204699.42307: done sending task result for task 12b410aa-8751-9e86-7728-0000000001cf 43681 1727204699.42310: WORKER PROCESS EXITING ok: [managed-node3] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 43681 1727204699.42416: no more pending results, returning what we have 43681 1727204699.42422: results queue empty 43681 1727204699.42424: checking for any_errors_fatal 43681 1727204699.42429: done checking for any_errors_fatal 43681 1727204699.42430: checking for max_fail_percentage 43681 1727204699.42432: done checking for max_fail_percentage 43681 1727204699.42433: checking to see if all hosts have failed and the running result is not ok 43681 1727204699.42434: done checking to see if all hosts have failed 43681 1727204699.42435: getting the remaining hosts for this loop 43681 1727204699.42437: done getting the remaining hosts for this loop 43681 1727204699.42441: getting the next task for host managed-node3 43681 1727204699.42447: done getting next task for host managed-node3 43681 1727204699.42450: ^ task is: TASK: Create veth interface {{ interface }} 43681 1727204699.42454: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 43681 1727204699.42458: getting variables 43681 1727204699.42460: in VariableManager get_vars() 43681 1727204699.42507: Calling all_inventory to load vars for managed-node3 43681 1727204699.42510: Calling groups_inventory to load vars for managed-node3 43681 1727204699.42513: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204699.42524: Calling all_plugins_play to load vars for managed-node3 43681 1727204699.42527: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204699.42531: Calling groups_plugins_play to load vars for managed-node3 43681 1727204699.42771: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204699.42956: done with get_vars() 43681 1727204699.42965: done getting variables 43681 1727204699.43017: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 43681 1727204699.43116: variable 'interface' from source: set_fact TASK [Create veth interface ethtest0] ****************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Tuesday 24 September 2024 15:04:59 -0400 (0:00:01.879) 0:00:07.098 ***** 43681 1727204699.43143: entering _queue_task() for managed-node3/command 43681 1727204699.43367: worker is 1 (out of 1 available) 43681 1727204699.43383: exiting _queue_task() for managed-node3/command 43681 1727204699.43399: done queuing things up, now waiting for results queue to drain 43681 1727204699.43401: waiting for pending results... 
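The TASK [Create veth interface ethtest0] queued here lives at manage_test_interface.yml:27 and, per the log, runs the command action over an items lookup, guarded by the conditional type == 'veth' and state == 'present' and interface not in current_interfaces. A hedged sketch follows; only the first loop item is visible in this portion of the log (the ip link add command recorded below), so any further items are omitted rather than guessed.

    # Hedged reconstruction of the looped command task; not the verbatim file.
    - name: Create veth interface {{ interface }}
      ansible.builtin.command: "{{ item }}"
      with_items:
        - ip link add {{ interface }} type veth peer name peer{{ interface }}
        # additional loop items are not shown in this part of the log and are
        # therefore not reproduced here
      when:
        - type == 'veth'
        - state == 'present'
        - interface not in current_interfaces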
43681 1727204699.43573: running TaskExecutor() for managed-node3/TASK: Create veth interface ethtest0 43681 1727204699.43654: in run() - task 12b410aa-8751-9e86-7728-0000000001d0 43681 1727204699.43667: variable 'ansible_search_path' from source: unknown 43681 1727204699.43671: variable 'ansible_search_path' from source: unknown 43681 1727204699.43895: variable 'interface' from source: set_fact 43681 1727204699.43967: variable 'interface' from source: set_fact 43681 1727204699.44031: variable 'interface' from source: set_fact 43681 1727204699.44157: Loaded config def from plugin (lookup/items) 43681 1727204699.44164: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 43681 1727204699.44185: variable 'omit' from source: magic vars 43681 1727204699.44277: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204699.44288: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204699.44302: variable 'omit' from source: magic vars 43681 1727204699.44494: variable 'ansible_distribution_major_version' from source: facts 43681 1727204699.44503: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204699.44674: variable 'type' from source: set_fact 43681 1727204699.44677: variable 'state' from source: include params 43681 1727204699.44683: variable 'interface' from source: set_fact 43681 1727204699.44688: variable 'current_interfaces' from source: set_fact 43681 1727204699.44697: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 43681 1727204699.44704: variable 'omit' from source: magic vars 43681 1727204699.44739: variable 'omit' from source: magic vars 43681 1727204699.44778: variable 'item' from source: unknown 43681 1727204699.44841: variable 'item' from source: unknown 43681 1727204699.44856: variable 'omit' from source: magic vars 43681 1727204699.44885: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204699.44913: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204699.44931: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204699.44949: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204699.44961: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204699.44987: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204699.44992: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204699.44997: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204699.45081: Set connection var ansible_shell_type to sh 43681 1727204699.45088: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204699.45100: Set connection var ansible_timeout to 10 43681 1727204699.45109: Set connection var ansible_pipelining to False 43681 1727204699.45115: Set connection var ansible_connection to ssh 43681 1727204699.45123: Set connection var ansible_shell_executable to /bin/sh 43681 1727204699.45139: variable 'ansible_shell_executable' from source: unknown 43681 1727204699.45142: variable 'ansible_connection' from source: unknown 43681 1727204699.45145: variable 
'ansible_module_compression' from source: unknown 43681 1727204699.45150: variable 'ansible_shell_type' from source: unknown 43681 1727204699.45153: variable 'ansible_shell_executable' from source: unknown 43681 1727204699.45158: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204699.45167: variable 'ansible_pipelining' from source: unknown 43681 1727204699.45170: variable 'ansible_timeout' from source: unknown 43681 1727204699.45195: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204699.45291: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 43681 1727204699.45304: variable 'omit' from source: magic vars 43681 1727204699.45309: starting attempt loop 43681 1727204699.45312: running the handler 43681 1727204699.45328: _low_level_execute_command(): starting 43681 1727204699.45335: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 43681 1727204699.45874: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204699.45877: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204699.45880: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204699.45883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204699.45942: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204699.45948: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204699.45984: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204699.47656: stdout chunk (state=3): >>>/root <<< 43681 1727204699.47765: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204699.47821: stderr chunk (state=3): >>><<< 43681 1727204699.47825: stdout chunk (state=3): >>><<< 43681 1727204699.47844: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204699.47854: _low_level_execute_command(): starting 43681 1727204699.47860: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204699.4784253-44028-124621268782920 `" && echo ansible-tmp-1727204699.4784253-44028-124621268782920="` echo /root/.ansible/tmp/ansible-tmp-1727204699.4784253-44028-124621268782920 `" ) && sleep 0' 43681 1727204699.48325: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204699.48330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 43681 1727204699.48334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 43681 1727204699.48337: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 43681 1727204699.48340: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204699.48393: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204699.48401: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204699.48439: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204699.50428: stdout chunk (state=3): >>>ansible-tmp-1727204699.4784253-44028-124621268782920=/root/.ansible/tmp/ansible-tmp-1727204699.4784253-44028-124621268782920 <<< 43681 1727204699.50547: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204699.50594: stderr chunk (state=3): >>><<< 43681 1727204699.50598: stdout chunk (state=3): >>><<< 43681 1727204699.50622: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204699.4784253-44028-124621268782920=/root/.ansible/tmp/ansible-tmp-1727204699.4784253-44028-124621268782920 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204699.50645: variable 'ansible_module_compression' from source: unknown 43681 1727204699.50685: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-436817kzep24q/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 43681 1727204699.50721: variable 'ansible_facts' from source: unknown 43681 1727204699.50784: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204699.4784253-44028-124621268782920/AnsiballZ_command.py 43681 1727204699.50901: Sending initial data 43681 1727204699.50904: Sent initial data (156 bytes) 43681 1727204699.51363: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204699.51369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 43681 1727204699.51371: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 43681 1727204699.51374: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 43681 1727204699.51378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204699.51431: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204699.51436: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204699.51469: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204699.53071: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: 
Unrecognised server extension "home-directory" <<< 43681 1727204699.53086: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 43681 1727204699.53106: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 43681 1727204699.53147: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-436817kzep24q/tmpfor_r2f2 /root/.ansible/tmp/ansible-tmp-1727204699.4784253-44028-124621268782920/AnsiballZ_command.py <<< 43681 1727204699.53159: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204699.4784253-44028-124621268782920/AnsiballZ_command.py" <<< 43681 1727204699.53177: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-436817kzep24q/tmpfor_r2f2" to remote "/root/.ansible/tmp/ansible-tmp-1727204699.4784253-44028-124621268782920/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204699.4784253-44028-124621268782920/AnsiballZ_command.py" <<< 43681 1727204699.53939: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204699.54000: stderr chunk (state=3): >>><<< 43681 1727204699.54007: stdout chunk (state=3): >>><<< 43681 1727204699.54028: done transferring module to remote 43681 1727204699.54038: _low_level_execute_command(): starting 43681 1727204699.54043: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204699.4784253-44028-124621268782920/ /root/.ansible/tmp/ansible-tmp-1727204699.4784253-44028-124621268782920/AnsiballZ_command.py && sleep 0' 43681 1727204699.54494: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204699.54498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204699.54504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204699.54609: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204699.54645: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204699.56467: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204699.56519: stderr chunk (state=3): >>><<< 43681 1727204699.56523: stdout chunk (state=3): >>><<< 43681 1727204699.56540: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204699.56543: _low_level_execute_command(): starting 43681 1727204699.56549: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204699.4784253-44028-124621268782920/AnsiballZ_command.py && sleep 0' 43681 1727204699.56995: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204699.56998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204699.57001: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204699.57003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204699.57060: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204699.57065: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204699.57106: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204699.75164: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0"], "start": "2024-09-24 15:04:59.741912", "end": "2024-09-24 15:04:59.749623", "delta": "0:00:00.007711", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add ethtest0 type veth peer name peerethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 43681 1727204699.78082: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
<<< 43681 1727204699.78086: stdout chunk (state=3): >>><<< 43681 1727204699.78088: stderr chunk (state=3): >>><<< 43681 1727204699.78275: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0"], "start": "2024-09-24 15:04:59.741912", "end": "2024-09-24 15:04:59.749623", "delta": "0:00:00.007711", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add ethtest0 type veth peer name peerethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
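At this point the first loop item has run successfully on managed-node3: ip link add ethtest0 type veth peer name peerethtest0 returned rc=0 in roughly 8 ms, so the ethtest0/peerethtest0 veth pair now exists. One way to confirm both ends from a play is sketched below; this is an editorial example, not part of the captured run, and the interface names are simply the ones recorded in the result above.

    # Editorial sketch only: confirm both ends of the veth pair created above.
    - name: Verify the veth pair exists
      ansible.builtin.command: ip link show {{ item }}
      loop:
        - ethtest0
        - peerethtest0
      changed_when: false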
43681 1727204699.78279: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add ethtest0 type veth peer name peerethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204699.4784253-44028-124621268782920/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 43681 1727204699.78281: _low_level_execute_command(): starting 43681 1727204699.78284: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204699.4784253-44028-124621268782920/ > /dev/null 2>&1 && sleep 0' 43681 1727204699.78957: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204699.79032: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204699.79062: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204699.79082: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204699.79167: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204699.84228: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204699.84291: stderr chunk (state=3): >>><<< 43681 1727204699.84295: stdout chunk (state=3): >>><<< 43681 1727204699.84310: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204699.84317: handler run complete 43681 1727204699.84341: Evaluated conditional (False): False 43681 1727204699.84353: attempt loop complete, returning result 43681 1727204699.84375: variable 'item' from source: unknown 43681 1727204699.84448: variable 'item' from source: unknown ok: [managed-node3] => (item=ip link add ethtest0 type veth peer name peerethtest0) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0" ], "delta": "0:00:00.007711", "end": "2024-09-24 15:04:59.749623", "item": "ip link add ethtest0 type veth peer name peerethtest0", "rc": 0, "start": "2024-09-24 15:04:59.741912" } 43681 1727204699.84637: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204699.84641: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204699.84644: variable 'omit' from source: magic vars 43681 1727204699.85040: variable 'ansible_distribution_major_version' from source: facts 43681 1727204699.85045: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204699.85207: variable 'type' from source: set_fact 43681 1727204699.85211: variable 'state' from source: include params 43681 1727204699.85217: variable 'interface' from source: set_fact 43681 1727204699.85224: variable 'current_interfaces' from source: set_fact 43681 1727204699.85231: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 43681 1727204699.85236: variable 'omit' from source: magic vars 43681 1727204699.85250: variable 'omit' from source: magic vars 43681 1727204699.85284: variable 'item' from source: unknown 43681 1727204699.85342: variable 'item' from source: unknown 43681 1727204699.85355: variable 'omit' from source: magic vars 43681 1727204699.85375: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204699.85383: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204699.85392: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204699.85405: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204699.85410: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204699.85413: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204699.85479: Set connection var ansible_shell_type to sh 43681 1727204699.85485: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204699.85493: Set connection var ansible_timeout to 10 43681 1727204699.85502: Set connection var ansible_pipelining to False 43681 1727204699.85508: Set connection var ansible_connection to ssh 43681 1727204699.85514: Set connection var ansible_shell_executable to /bin/sh 43681 1727204699.85560: variable 'ansible_shell_executable' from source: unknown 43681 1727204699.85563: variable 'ansible_connection' from source: unknown 
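The "Set connection var ..." entries above fix the per-task connection settings before the next loop item runs. Collected into one plain Python dict for reference (the values are copied from the log; the dict itself is only illustrative, not an Ansible data structure):

    # Connection settings as logged for managed-node3 on this task.
    CONNECTION_VARS = {
        "ansible_connection": "ssh",
        "ansible_shell_type": "sh",
        "ansible_shell_executable": "/bin/sh",
        "ansible_module_compression": "ZIP_DEFLATED",
        "ansible_timeout": 10,
        "ansible_pipelining": False,
    }

    for name, value in sorted(CONNECTION_VARS.items()):
        print(f"{name} = {value!r}")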
43681 1727204699.85566: variable 'ansible_module_compression' from source: unknown 43681 1727204699.85568: variable 'ansible_shell_type' from source: unknown 43681 1727204699.85571: variable 'ansible_shell_executable' from source: unknown 43681 1727204699.85573: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204699.85575: variable 'ansible_pipelining' from source: unknown 43681 1727204699.85577: variable 'ansible_timeout' from source: unknown 43681 1727204699.85580: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204699.85808: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 43681 1727204699.85812: variable 'omit' from source: magic vars 43681 1727204699.85814: starting attempt loop 43681 1727204699.85820: running the handler 43681 1727204699.85823: _low_level_execute_command(): starting 43681 1727204699.85825: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 43681 1727204699.86370: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204699.86405: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204699.86509: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204699.86536: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204699.86603: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204699.88267: stdout chunk (state=3): >>>/root <<< 43681 1727204699.88379: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204699.88429: stderr chunk (state=3): >>><<< 43681 1727204699.88432: stdout chunk (state=3): >>><<< 43681 1727204699.88448: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204699.88457: _low_level_execute_command(): starting 43681 1727204699.88465: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204699.8844767-44028-27930895927558 `" && echo ansible-tmp-1727204699.8844767-44028-27930895927558="` echo /root/.ansible/tmp/ansible-tmp-1727204699.8844767-44028-27930895927558 `" ) && sleep 0' 43681 1727204699.89017: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204699.89050: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204699.89094: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204699.96662: stdout chunk (state=3): >>>ansible-tmp-1727204699.8844767-44028-27930895927558=/root/.ansible/tmp/ansible-tmp-1727204699.8844767-44028-27930895927558 <<< 43681 1727204699.96782: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204699.96838: stderr chunk (state=3): >>><<< 43681 1727204699.96842: stdout chunk (state=3): >>><<< 43681 1727204699.96857: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204699.8844767-44028-27930895927558=/root/.ansible/tmp/ansible-tmp-1727204699.8844767-44028-27930895927558 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204699.96880: variable 'ansible_module_compression' from source: unknown 43681 1727204699.96921: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-436817kzep24q/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 43681 1727204699.96937: variable 'ansible_facts' from source: unknown 43681 1727204699.96982: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204699.8844767-44028-27930895927558/AnsiballZ_command.py 43681 1727204699.97087: Sending initial data 43681 1727204699.97093: Sent initial data (155 bytes) 43681 1727204699.97541: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204699.97545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204699.97549: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204699.97551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204699.97608: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204699.97611: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204699.97651: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204699.99270: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 43681 1727204699.99363: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 43681 1727204699.99367: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-436817kzep24q/tmpjepd1_m9 /root/.ansible/tmp/ansible-tmp-1727204699.8844767-44028-27930895927558/AnsiballZ_command.py <<< 43681 1727204699.99370: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204699.8844767-44028-27930895927558/AnsiballZ_command.py" <<< 43681 1727204699.99422: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-436817kzep24q/tmpjepd1_m9" to remote "/root/.ansible/tmp/ansible-tmp-1727204699.8844767-44028-27930895927558/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204699.8844767-44028-27930895927558/AnsiballZ_command.py" <<< 43681 1727204700.00589: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204700.00769: stderr chunk (state=3): >>><<< 43681 1727204700.00773: stdout chunk (state=3): >>><<< 43681 1727204700.00776: done transferring module to remote 43681 1727204700.00778: _low_level_execute_command(): starting 43681 1727204700.00781: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204699.8844767-44028-27930895927558/ /root/.ansible/tmp/ansible-tmp-1727204699.8844767-44028-27930895927558/AnsiballZ_command.py && sleep 0' 43681 1727204700.01370: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204700.01382: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204700.01397: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204700.01415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204700.01443: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 43681 1727204700.01454: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204700.01495: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204700.01507: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204700.01551: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204700.03432: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204700.03494: stderr chunk (state=3): >>><<< 43681 1727204700.03499: stdout chunk (state=3): >>><<< 43681 1727204700.03518: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204700.03522: _low_level_execute_command(): starting 43681 1727204700.03525: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204699.8844767-44028-27930895927558/AnsiballZ_command.py && sleep 0' 43681 1727204700.03996: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204700.04000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204700.04011: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204700.04074: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204700.04077: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204700.04130: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204700.22108: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerethtest0", "up"], "start": "2024-09-24 15:05:00.215811", "end": "2024-09-24 15:05:00.219726", "delta": "0:00:00.003915", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 43681 1727204700.24002: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
<<< 43681 1727204700.24007: stdout chunk (state=3): >>><<< 43681 1727204700.24009: stderr chunk (state=3): >>><<< 43681 1727204700.24012: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerethtest0", "up"], "start": "2024-09-24 15:05:00.215811", "end": "2024-09-24 15:05:00.219726", "delta": "0:00:00.003915", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
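Before each module run the log shows a remote scratch directory being created under a 077 umask with a name of the form ansible-tmp-<time>-<pid>-<random> (e.g. ansible-tmp-1727204699.8844767-44028-27930895927558). A small sketch of that step follows; the exact name formula is inferred from the names in this log, not taken from Ansible's source:

    # Sketch of the "umask 77 && mkdir -p ... && mkdir ansible-tmp-..." step.
    import os
    import random
    import time

    def make_remote_tmp(base="~/.ansible/tmp"):
        # Assumed naming pattern: ansible-tmp-<epoch>-<pid>-<random>
        name = "ansible-tmp-%s-%s-%s" % (time.time(), os.getpid(),
                                         random.randint(0, 2 ** 48))
        path = os.path.join(os.path.expanduser(base), name)
        old_umask = os.umask(0o077)        # mirrors the umask 77 in the logged command
        try:
            os.makedirs(path)              # directory ends up mode 0700 under that umask
        finally:
            os.umask(old_umask)
        return path

    if __name__ == "__main__":
        print(make_remote_tmp())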
43681 1727204700.24015: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerethtest0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204699.8844767-44028-27930895927558/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 43681 1727204700.24021: _low_level_execute_command(): starting 43681 1727204700.24024: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204699.8844767-44028-27930895927558/ > /dev/null 2>&1 && sleep 0' 43681 1727204700.24745: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204700.24773: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204700.24897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204700.24928: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204700.25004: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204700.27055: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204700.27074: stdout chunk (state=3): >>><<< 43681 1727204700.27087: stderr chunk (state=3): >>><<< 43681 1727204700.27113: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204700.27128: handler run complete 43681 1727204700.27161: Evaluated conditional (False): False 43681 1727204700.27281: attempt loop complete, returning result 43681 1727204700.27284: variable 'item' from source: unknown 43681 1727204700.27330: variable 'item' from source: unknown ok: [managed-node3] => (item=ip link set peerethtest0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerethtest0", "up" ], "delta": "0:00:00.003915", "end": "2024-09-24 15:05:00.219726", "item": "ip link set peerethtest0 up", "rc": 0, "start": "2024-09-24 15:05:00.215811" } 43681 1727204700.27800: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204700.27804: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204700.27806: variable 'omit' from source: magic vars 43681 1727204700.27831: variable 'ansible_distribution_major_version' from source: facts 43681 1727204700.27844: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204700.28111: variable 'type' from source: set_fact 43681 1727204700.28131: variable 'state' from source: include params 43681 1727204700.28141: variable 'interface' from source: set_fact 43681 1727204700.28150: variable 'current_interfaces' from source: set_fact 43681 1727204700.28161: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 43681 1727204700.28170: variable 'omit' from source: magic vars 43681 1727204700.28199: variable 'omit' from source: magic vars 43681 1727204700.28265: variable 'item' from source: unknown 43681 1727204700.28353: variable 'item' from source: unknown 43681 1727204700.28376: variable 'omit' from source: magic vars 43681 1727204700.28406: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204700.28422: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204700.28434: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204700.28465: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204700.28475: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204700.28483: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204700.28597: Set connection var ansible_shell_type to sh 43681 1727204700.28672: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204700.28675: Set connection var ansible_timeout to 10 43681 1727204700.28678: Set connection var ansible_pipelining to False 43681 1727204700.28680: Set connection var ansible_connection to ssh 43681 1727204700.28682: Set connection var ansible_shell_executable to /bin/sh 43681 1727204700.28684: variable 'ansible_shell_executable' from source: unknown 43681 1727204700.28692: variable 'ansible_connection' from source: unknown 43681 1727204700.28701: variable 'ansible_module_compression' from source: unknown 43681 1727204700.28708: variable 
'ansible_shell_type' from source: unknown 43681 1727204700.28715: variable 'ansible_shell_executable' from source: unknown 43681 1727204700.28725: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204700.28733: variable 'ansible_pipelining' from source: unknown 43681 1727204700.28740: variable 'ansible_timeout' from source: unknown 43681 1727204700.28749: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204700.28872: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 43681 1727204700.28898: variable 'omit' from source: magic vars 43681 1727204700.28999: starting attempt loop 43681 1727204700.29003: running the handler 43681 1727204700.29005: _low_level_execute_command(): starting 43681 1727204700.29007: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 43681 1727204700.29634: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204700.29654: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204700.29788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204700.29813: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204700.29833: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204700.29855: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204700.29928: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204700.31712: stdout chunk (state=3): >>>/root <<< 43681 1727204700.31844: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204700.31926: stderr chunk (state=3): >>><<< 43681 1727204700.31949: stdout chunk (state=3): >>><<< 43681 1727204700.31971: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204700.31986: _low_level_execute_command(): starting 43681 1727204700.31998: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204700.3197672-44028-185882668724488 `" && echo ansible-tmp-1727204700.3197672-44028-185882668724488="` echo /root/.ansible/tmp/ansible-tmp-1727204700.3197672-44028-185882668724488 `" ) && sleep 0' 43681 1727204700.32663: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204700.32680: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204700.32698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204700.32719: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204700.32760: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 43681 1727204700.32776: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204700.32871: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204700.32893: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204700.32908: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204700.32983: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204700.35056: stdout chunk (state=3): >>>ansible-tmp-1727204700.3197672-44028-185882668724488=/root/.ansible/tmp/ansible-tmp-1727204700.3197672-44028-185882668724488 <<< 43681 1727204700.35264: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204700.35295: stdout chunk (state=3): >>><<< 43681 1727204700.35298: stderr chunk (state=3): >>><<< 43681 1727204700.35495: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204700.3197672-44028-185882668724488=/root/.ansible/tmp/ansible-tmp-1727204700.3197672-44028-185882668724488 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204700.35499: variable 'ansible_module_compression' from source: unknown 43681 1727204700.35502: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-436817kzep24q/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 43681 1727204700.35504: variable 'ansible_facts' from source: unknown 43681 1727204700.35506: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204700.3197672-44028-185882668724488/AnsiballZ_command.py 43681 1727204700.35762: Sending initial data 43681 1727204700.35765: Sent initial data (156 bytes) 43681 1727204700.36369: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204700.36393: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204700.36413: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204700.36509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204700.36548: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204700.36567: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204700.36594: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204700.36669: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204700.38412: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: 
Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 43681 1727204700.38479: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 43681 1727204700.38540: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-436817kzep24q/tmpvpb23ops /root/.ansible/tmp/ansible-tmp-1727204700.3197672-44028-185882668724488/AnsiballZ_command.py <<< 43681 1727204700.38577: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204700.3197672-44028-185882668724488/AnsiballZ_command.py" debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-436817kzep24q/tmpvpb23ops" to remote "/root/.ansible/tmp/ansible-tmp-1727204700.3197672-44028-185882668724488/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204700.3197672-44028-185882668724488/AnsiballZ_command.py" <<< 43681 1727204700.39691: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204700.39767: stderr chunk (state=3): >>><<< 43681 1727204700.39788: stdout chunk (state=3): >>><<< 43681 1727204700.39890: done transferring module to remote 43681 1727204700.39895: _low_level_execute_command(): starting 43681 1727204700.39898: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204700.3197672-44028-185882668724488/ /root/.ansible/tmp/ansible-tmp-1727204700.3197672-44028-185882668724488/AnsiballZ_command.py && sleep 0' 43681 1727204700.40514: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204700.40534: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204700.40558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204700.40667: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204700.40685: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204700.40712: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204700.40731: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204700.40803: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204700.42859: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204700.42863: stdout chunk (state=3): >>><<< 43681 1727204700.42872: stderr chunk (state=3): >>><<< 43681 1727204700.42895: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204700.42898: _low_level_execute_command(): starting 43681 1727204700.42904: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204700.3197672-44028-185882668724488/AnsiballZ_command.py && sleep 0' 43681 1727204700.43599: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204700.43604: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204700.43610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204700.43700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204700.43704: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 43681 1727204700.43706: stderr chunk (state=3): >>>debug2: match not found <<< 43681 1727204700.43708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204700.43710: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 43681 1727204700.43713: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address <<< 43681 1727204700.43757: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204700.43792: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204700.43812: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204700.43833: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204700.43966: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204700.62286: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "ethtest0", "up"], "start": "2024-09-24 15:05:00.616937", "end": "2024-09-24 15:05:00.621243", "delta": "0:00:00.004306", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set ethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": 
true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 43681 1727204700.64005: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 43681 1727204700.64070: stderr chunk (state=3): >>><<< 43681 1727204700.64074: stdout chunk (state=3): >>><<< 43681 1727204700.64091: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "ethtest0", "up"], "start": "2024-09-24 15:05:00.616937", "end": "2024-09-24 15:05:00.621243", "delta": "0:00:00.004306", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set ethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
43681 1727204700.64122: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set ethtest0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204700.3197672-44028-185882668724488/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 43681 1727204700.64130: _low_level_execute_command(): starting 43681 1727204700.64137: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204700.3197672-44028-185882668724488/ > /dev/null 2>&1 && sleep 0' 43681 1727204700.64624: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204700.64628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 43681 1727204700.64631: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204700.64633: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204700.64635: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204700.64690: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204700.64701: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204700.64735: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204700.66691: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204700.66747: stderr chunk (state=3): >>><<< 43681 1727204700.66752: stdout chunk (state=3): >>><<< 43681 1727204700.66767: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204700.66773: handler run complete 43681 1727204700.66796: Evaluated conditional (False): False 43681 1727204700.66806: attempt loop complete, returning result 43681 1727204700.66826: variable 'item' from source: unknown 43681 1727204700.66901: variable 'item' from source: unknown ok: [managed-node3] => (item=ip link set ethtest0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "ethtest0", "up" ], "delta": "0:00:00.004306", "end": "2024-09-24 15:05:00.621243", "item": "ip link set ethtest0 up", "rc": 0, "start": "2024-09-24 15:05:00.616937" } 43681 1727204700.67035: dumping result to json 43681 1727204700.67038: done dumping result, returning 43681 1727204700.67041: done running TaskExecutor() for managed-node3/TASK: Create veth interface ethtest0 [12b410aa-8751-9e86-7728-0000000001d0] 43681 1727204700.67044: sending task result for task 12b410aa-8751-9e86-7728-0000000001d0 43681 1727204700.67185: no more pending results, returning what we have 43681 1727204700.67192: results queue empty 43681 1727204700.67193: checking for any_errors_fatal 43681 1727204700.67199: done checking for any_errors_fatal 43681 1727204700.67207: checking for max_fail_percentage 43681 1727204700.67209: done checking for max_fail_percentage 43681 1727204700.67209: checking to see if all hosts have failed and the running result is not ok 43681 1727204700.67211: done checking to see if all hosts have failed 43681 1727204700.67211: getting the remaining hosts for this loop 43681 1727204700.67213: done getting the remaining hosts for this loop 43681 1727204700.67218: getting the next task for host managed-node3 43681 1727204700.67223: done getting next task for host managed-node3 43681 1727204700.67227: ^ task is: TASK: Set up veth as managed by NetworkManager 43681 1727204700.67230: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204700.67234: getting variables 43681 1727204700.67236: in VariableManager get_vars() 43681 1727204700.67273: Calling all_inventory to load vars for managed-node3 43681 1727204700.67276: Calling groups_inventory to load vars for managed-node3 43681 1727204700.67279: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204700.67286: done sending task result for task 12b410aa-8751-9e86-7728-0000000001d0 43681 1727204700.67291: WORKER PROCESS EXITING 43681 1727204700.67301: Calling all_plugins_play to load vars for managed-node3 43681 1727204700.67305: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204700.67314: Calling groups_plugins_play to load vars for managed-node3 43681 1727204700.67724: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204700.67902: done with get_vars() 43681 1727204700.67912: done getting variables 43681 1727204700.67961: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Tuesday 24 September 2024 15:05:00 -0400 (0:00:01.248) 0:00:08.346 ***** 43681 1727204700.67986: entering _queue_task() for managed-node3/command 43681 1727204700.68223: worker is 1 (out of 1 available) 43681 1727204700.68238: exiting _queue_task() for managed-node3/command 43681 1727204700.68251: done queuing things up, now waiting for results queue to drain 43681 1727204700.68253: waiting for pending results... 
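The loop result above, "ok: [managed-node3] => (item=ip link set ethtest0 up)", is the final item of the "Create veth interface ethtest0" helper task. The actual manage_test_interface.yml is not reproduced in this log, so the following is only a hedged sketch of a looped command task that would produce per-item results of that shape; the first two loop items and the peer interface name are assumptions, and only "ip link set ethtest0 up" appears in the excerpt above.

- name: Create veth interface {{ interface }}    # sketch only, not the real task file
  ansible.builtin.command: "{{ item }}"
  loop:
    - ip link add {{ interface }} type veth peer name peer{{ interface }}   # assumed item
    - ip link set peer{{ interface }} up                                    # assumed item
    - ip link set {{ interface }} up    # the only item visible in the log excerpt above
  when: type == 'veth' and state == 'present'   # condition inferred from nearby log entries

Each loop item goes through the same _execute_module / _low_level_execute_command cycle traced above, which is why the SSH mux and remote tmpdir steps repeat per item.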
43681 1727204700.68428: running TaskExecutor() for managed-node3/TASK: Set up veth as managed by NetworkManager 43681 1727204700.68508: in run() - task 12b410aa-8751-9e86-7728-0000000001d1 43681 1727204700.68525: variable 'ansible_search_path' from source: unknown 43681 1727204700.68529: variable 'ansible_search_path' from source: unknown 43681 1727204700.68561: calling self._execute() 43681 1727204700.68638: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204700.68644: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204700.68654: variable 'omit' from source: magic vars 43681 1727204700.68988: variable 'ansible_distribution_major_version' from source: facts 43681 1727204700.69000: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204700.69142: variable 'type' from source: set_fact 43681 1727204700.69146: variable 'state' from source: include params 43681 1727204700.69154: Evaluated conditional (type == 'veth' and state == 'present'): True 43681 1727204700.69161: variable 'omit' from source: magic vars 43681 1727204700.69194: variable 'omit' from source: magic vars 43681 1727204700.69279: variable 'interface' from source: set_fact 43681 1727204700.69297: variable 'omit' from source: magic vars 43681 1727204700.69335: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204700.69371: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204700.69398: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204700.69414: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204700.69428: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204700.69457: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204700.69460: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204700.69463: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204700.69552: Set connection var ansible_shell_type to sh 43681 1727204700.69558: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204700.69565: Set connection var ansible_timeout to 10 43681 1727204700.69575: Set connection var ansible_pipelining to False 43681 1727204700.69586: Set connection var ansible_connection to ssh 43681 1727204700.69588: Set connection var ansible_shell_executable to /bin/sh 43681 1727204700.69611: variable 'ansible_shell_executable' from source: unknown 43681 1727204700.69615: variable 'ansible_connection' from source: unknown 43681 1727204700.69617: variable 'ansible_module_compression' from source: unknown 43681 1727204700.69623: variable 'ansible_shell_type' from source: unknown 43681 1727204700.69626: variable 'ansible_shell_executable' from source: unknown 43681 1727204700.69631: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204700.69636: variable 'ansible_pipelining' from source: unknown 43681 1727204700.69639: variable 'ansible_timeout' from source: unknown 43681 1727204700.69644: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204700.69769: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 43681 1727204700.69780: variable 'omit' from source: magic vars 43681 1727204700.69786: starting attempt loop 43681 1727204700.69793: running the handler 43681 1727204700.69810: _low_level_execute_command(): starting 43681 1727204700.69817: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 43681 1727204700.70374: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204700.70379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204700.70385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204700.70442: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204700.70446: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204700.70497: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204700.72199: stdout chunk (state=3): >>>/root <<< 43681 1727204700.72306: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204700.72370: stderr chunk (state=3): >>><<< 43681 1727204700.72374: stdout chunk (state=3): >>><<< 43681 1727204700.72400: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204700.72413: _low_level_execute_command(): starting 
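Before the handler runs, the log above records the connection variables Ansible resolved for managed-node3 ("Set connection var ansible_shell_type to sh", "ansible_timeout to 10", "ansible_pipelining to False", and so on). As a minimal sketch only, the same settings could be pinned explicitly in a host_vars file; the file name is hypothetical and the values simply mirror what the log reports.

# host_vars/managed-node3.yml  (hypothetical file; values mirror the log above)
ansible_connection: ssh
ansible_shell_type: sh
ansible_shell_executable: /bin/sh
ansible_timeout: 10
ansible_pipelining: false
ansible_module_compression: ZIP_DEFLATED

With ansible_pipelining left at false, each command task goes through the mkdir / SFTP transfer / chmod / cleanup sequence that the following log entries show; enabling pipelining would typically skip the remote temp directory for modules that support it.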
43681 1727204700.72424: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204700.7240021-44075-237437942127278 `" && echo ansible-tmp-1727204700.7240021-44075-237437942127278="` echo /root/.ansible/tmp/ansible-tmp-1727204700.7240021-44075-237437942127278 `" ) && sleep 0' 43681 1727204700.72895: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204700.72918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204700.72931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204700.72986: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204700.72995: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204700.73031: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204700.74997: stdout chunk (state=3): >>>ansible-tmp-1727204700.7240021-44075-237437942127278=/root/.ansible/tmp/ansible-tmp-1727204700.7240021-44075-237437942127278 <<< 43681 1727204700.75110: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204700.75171: stderr chunk (state=3): >>><<< 43681 1727204700.75175: stdout chunk (state=3): >>><<< 43681 1727204700.75194: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204700.7240021-44075-237437942127278=/root/.ansible/tmp/ansible-tmp-1727204700.7240021-44075-237437942127278 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204700.75227: 
variable 'ansible_module_compression' from source: unknown 43681 1727204700.75278: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-436817kzep24q/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 43681 1727204700.75315: variable 'ansible_facts' from source: unknown 43681 1727204700.75383: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204700.7240021-44075-237437942127278/AnsiballZ_command.py 43681 1727204700.75508: Sending initial data 43681 1727204700.75512: Sent initial data (156 bytes) 43681 1727204700.75992: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204700.75996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 43681 1727204700.75999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 43681 1727204700.76003: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 43681 1727204700.76006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204700.76058: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204700.76061: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204700.76106: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204700.77700: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 43681 1727204700.77704: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 43681 1727204700.77739: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 43681 1727204700.77769: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-436817kzep24q/tmpgguse3dj /root/.ansible/tmp/ansible-tmp-1727204700.7240021-44075-237437942127278/AnsiballZ_command.py <<< 43681 1727204700.77773: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204700.7240021-44075-237437942127278/AnsiballZ_command.py" <<< 43681 1727204700.77802: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-436817kzep24q/tmpgguse3dj" to remote "/root/.ansible/tmp/ansible-tmp-1727204700.7240021-44075-237437942127278/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204700.7240021-44075-237437942127278/AnsiballZ_command.py" <<< 43681 1727204700.78572: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204700.78647: stderr chunk (state=3): >>><<< 43681 1727204700.78650: stdout chunk (state=3): >>><<< 43681 1727204700.78670: done transferring module to remote 43681 1727204700.78681: _low_level_execute_command(): starting 43681 1727204700.78687: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204700.7240021-44075-237437942127278/ /root/.ansible/tmp/ansible-tmp-1727204700.7240021-44075-237437942127278/AnsiballZ_command.py && sleep 0' 43681 1727204700.79162: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204700.79165: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 43681 1727204700.79168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 43681 1727204700.79170: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204700.79229: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204700.79238: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204700.79270: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204700.81087: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204700.81146: stderr chunk (state=3): >>><<< 43681 1727204700.81151: stdout chunk (state=3): >>><<< 43681 1727204700.81167: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204700.81171: _low_level_execute_command(): starting 43681 1727204700.81176: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204700.7240021-44075-237437942127278/AnsiballZ_command.py && sleep 0' 43681 1727204700.81652: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204700.81656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204700.81661: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204700.81664: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204700.81721: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204700.81726: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204700.81769: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204701.01113: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "ethtest0", "managed", "true"], "start": "2024-09-24 15:05:00.989220", "end": "2024-09-24 15:05:01.009834", "delta": "0:00:00.020614", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set ethtest0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 43681 1727204701.03098: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
<<< 43681 1727204701.03102: stdout chunk (state=3): >>><<< 43681 1727204701.03105: stderr chunk (state=3): >>><<< 43681 1727204701.03108: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "ethtest0", "managed", "true"], "start": "2024-09-24 15:05:00.989220", "end": "2024-09-24 15:05:01.009834", "delta": "0:00:00.020614", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set ethtest0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
43681 1727204701.03111: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set ethtest0 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204700.7240021-44075-237437942127278/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 43681 1727204701.03113: _low_level_execute_command(): starting 43681 1727204701.03115: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204700.7240021-44075-237437942127278/ > /dev/null 2>&1 && sleep 0' 43681 1727204701.03851: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204701.03888: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204701.03974: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204701.05942: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204701.06005: stderr chunk (state=3): >>><<< 43681 1727204701.06008: stdout chunk (state=3): >>><<< 43681 1727204701.06027: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204701.06034: handler run complete 43681 1727204701.06056: Evaluated conditional (False): False 43681 1727204701.06076: attempt loop complete, returning result 43681 1727204701.06079: _execute() done 43681 1727204701.06082: dumping result to json 43681 1727204701.06090: done dumping result, returning 43681 1727204701.06099: done running TaskExecutor() for managed-node3/TASK: Set up veth as managed by NetworkManager [12b410aa-8751-9e86-7728-0000000001d1] 43681 1727204701.06106: sending task result for task 12b410aa-8751-9e86-7728-0000000001d1 43681 1727204701.06215: done sending task result for task 12b410aa-8751-9e86-7728-0000000001d1 43681 1727204701.06220: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": [ "nmcli", "d", "set", "ethtest0", "managed", "true" ], "delta": "0:00:00.020614", "end": "2024-09-24 15:05:01.009834", "rc": 0, "start": "2024-09-24 15:05:00.989220" } 43681 1727204701.06299: no more pending results, returning what we have 43681 1727204701.06304: results queue empty 43681 1727204701.06305: checking for any_errors_fatal 43681 1727204701.06323: done checking for any_errors_fatal 43681 1727204701.06324: checking for max_fail_percentage 43681 1727204701.06326: done checking for max_fail_percentage 43681 1727204701.06327: checking to see if all hosts have failed and the running result is not ok 43681 1727204701.06328: done checking to see if all hosts have failed 43681 1727204701.06329: getting the remaining hosts for this loop 43681 1727204701.06331: done getting the remaining hosts for this loop 43681 1727204701.06336: getting the next task for host managed-node3 43681 1727204701.06341: done getting next task for host managed-node3 43681 1727204701.06344: ^ task is: TASK: Delete veth interface {{ interface }} 43681 1727204701.06347: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204701.06352: getting variables 43681 1727204701.06354: in VariableManager get_vars() 43681 1727204701.06398: Calling all_inventory to load vars for managed-node3 43681 1727204701.06401: Calling groups_inventory to load vars for managed-node3 43681 1727204701.06404: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204701.06415: Calling all_plugins_play to load vars for managed-node3 43681 1727204701.06420: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204701.06424: Calling groups_plugins_play to load vars for managed-node3 43681 1727204701.06609: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204701.06798: done with get_vars() 43681 1727204701.06809: done getting variables 43681 1727204701.06863: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 43681 1727204701.06969: variable 'interface' from source: set_fact TASK [Delete veth interface ethtest0] ****************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Tuesday 24 September 2024 15:05:01 -0400 (0:00:00.390) 0:00:08.736 ***** 43681 1727204701.06997: entering _queue_task() for managed-node3/command 43681 1727204701.07253: worker is 1 (out of 1 available) 43681 1727204701.07269: exiting _queue_task() for managed-node3/command 43681 1727204701.07283: done queuing things up, now waiting for results queue to drain 43681 1727204701.07285: waiting for pending results... 
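The task that just finished, "Set up veth as managed by NetworkManager" (manage_test_interface.yml:35), ran "nmcli d set ethtest0 managed true" and was gated on the conditional the log evaluates as type == 'veth' and state == 'present'. Note that the raw module result reported changed: true while the final task result above reports changed: false, which suggests the task suppresses change reporting. A hedged sketch consistent with those observations (not the actual file contents; the changed_when line is an inference):

- name: Set up veth as managed by NetworkManager    # sketch only, not the real task file
  ansible.builtin.command: nmcli d set {{ interface }} managed true
  when: type == 'veth' and state == 'present'
  changed_when: false   # inferred: module returned changed=true, task reported changed=false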
43681 1727204701.07464: running TaskExecutor() for managed-node3/TASK: Delete veth interface ethtest0 43681 1727204701.07556: in run() - task 12b410aa-8751-9e86-7728-0000000001d2 43681 1727204701.07569: variable 'ansible_search_path' from source: unknown 43681 1727204701.07573: variable 'ansible_search_path' from source: unknown 43681 1727204701.07610: calling self._execute() 43681 1727204701.07684: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204701.07692: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204701.07702: variable 'omit' from source: magic vars 43681 1727204701.08037: variable 'ansible_distribution_major_version' from source: facts 43681 1727204701.08048: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204701.08231: variable 'type' from source: set_fact 43681 1727204701.08235: variable 'state' from source: include params 43681 1727204701.08241: variable 'interface' from source: set_fact 43681 1727204701.08246: variable 'current_interfaces' from source: set_fact 43681 1727204701.08255: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False 43681 1727204701.08257: when evaluation is False, skipping this task 43681 1727204701.08260: _execute() done 43681 1727204701.08265: dumping result to json 43681 1727204701.08269: done dumping result, returning 43681 1727204701.08278: done running TaskExecutor() for managed-node3/TASK: Delete veth interface ethtest0 [12b410aa-8751-9e86-7728-0000000001d2] 43681 1727204701.08288: sending task result for task 12b410aa-8751-9e86-7728-0000000001d2 43681 1727204701.08381: done sending task result for task 12b410aa-8751-9e86-7728-0000000001d2 43681 1727204701.08384: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 43681 1727204701.08444: no more pending results, returning what we have 43681 1727204701.08449: results queue empty 43681 1727204701.08450: checking for any_errors_fatal 43681 1727204701.08461: done checking for any_errors_fatal 43681 1727204701.08462: checking for max_fail_percentage 43681 1727204701.08464: done checking for max_fail_percentage 43681 1727204701.08465: checking to see if all hosts have failed and the running result is not ok 43681 1727204701.08466: done checking to see if all hosts have failed 43681 1727204701.08466: getting the remaining hosts for this loop 43681 1727204701.08468: done getting the remaining hosts for this loop 43681 1727204701.08473: getting the next task for host managed-node3 43681 1727204701.08479: done getting next task for host managed-node3 43681 1727204701.08482: ^ task is: TASK: Create dummy interface {{ interface }} 43681 1727204701.08486: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204701.08492: getting variables 43681 1727204701.08493: in VariableManager get_vars() 43681 1727204701.08531: Calling all_inventory to load vars for managed-node3 43681 1727204701.08534: Calling groups_inventory to load vars for managed-node3 43681 1727204701.08537: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204701.08548: Calling all_plugins_play to load vars for managed-node3 43681 1727204701.08551: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204701.08554: Calling groups_plugins_play to load vars for managed-node3 43681 1727204701.08740: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204701.08969: done with get_vars() 43681 1727204701.08979: done getting variables 43681 1727204701.09032: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 43681 1727204701.09126: variable 'interface' from source: set_fact TASK [Create dummy interface ethtest0] ***************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Tuesday 24 September 2024 15:05:01 -0400 (0:00:00.021) 0:00:08.758 ***** 43681 1727204701.09154: entering _queue_task() for managed-node3/command 43681 1727204701.09392: worker is 1 (out of 1 available) 43681 1727204701.09407: exiting _queue_task() for managed-node3/command 43681 1727204701.09420: done queuing things up, now waiting for results queue to drain 43681 1727204701.09422: waiting for pending results... 
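The "Delete veth interface ethtest0" task above was skipped because its condition, reported as the false_condition "type == 'veth' and state == 'absent' and interface in current_interfaces", evaluated to False for this run (state is 'present'). The same pattern repeats below for the dummy and tap variants. A hedged sketch of a task carrying that gate, where the delete command itself is an assumption (only the condition string appears in the log):

- name: Delete veth interface {{ interface }}    # sketch only, not the real task file
  ansible.builtin.command: ip link del {{ interface }}    # assumed command
  when: type == 'veth' and state == 'absent' and interface in current_interfaces

In this run only the veth-creation branch matched, so the remaining branches appear as "skipping:" entries like the ones that follow.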
43681 1727204701.09603: running TaskExecutor() for managed-node3/TASK: Create dummy interface ethtest0 43681 1727204701.09692: in run() - task 12b410aa-8751-9e86-7728-0000000001d3 43681 1727204701.09705: variable 'ansible_search_path' from source: unknown 43681 1727204701.09709: variable 'ansible_search_path' from source: unknown 43681 1727204701.09744: calling self._execute() 43681 1727204701.09818: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204701.09827: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204701.09837: variable 'omit' from source: magic vars 43681 1727204701.10173: variable 'ansible_distribution_major_version' from source: facts 43681 1727204701.10184: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204701.10367: variable 'type' from source: set_fact 43681 1727204701.10371: variable 'state' from source: include params 43681 1727204701.10377: variable 'interface' from source: set_fact 43681 1727204701.10381: variable 'current_interfaces' from source: set_fact 43681 1727204701.10393: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 43681 1727204701.10396: when evaluation is False, skipping this task 43681 1727204701.10399: _execute() done 43681 1727204701.10401: dumping result to json 43681 1727204701.10406: done dumping result, returning 43681 1727204701.10414: done running TaskExecutor() for managed-node3/TASK: Create dummy interface ethtest0 [12b410aa-8751-9e86-7728-0000000001d3] 43681 1727204701.10428: sending task result for task 12b410aa-8751-9e86-7728-0000000001d3 43681 1727204701.10519: done sending task result for task 12b410aa-8751-9e86-7728-0000000001d3 43681 1727204701.10522: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 43681 1727204701.10577: no more pending results, returning what we have 43681 1727204701.10581: results queue empty 43681 1727204701.10582: checking for any_errors_fatal 43681 1727204701.10591: done checking for any_errors_fatal 43681 1727204701.10592: checking for max_fail_percentage 43681 1727204701.10594: done checking for max_fail_percentage 43681 1727204701.10595: checking to see if all hosts have failed and the running result is not ok 43681 1727204701.10596: done checking to see if all hosts have failed 43681 1727204701.10597: getting the remaining hosts for this loop 43681 1727204701.10598: done getting the remaining hosts for this loop 43681 1727204701.10604: getting the next task for host managed-node3 43681 1727204701.10610: done getting next task for host managed-node3 43681 1727204701.10613: ^ task is: TASK: Delete dummy interface {{ interface }} 43681 1727204701.10616: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204701.10620: getting variables 43681 1727204701.10622: in VariableManager get_vars() 43681 1727204701.10658: Calling all_inventory to load vars for managed-node3 43681 1727204701.10661: Calling groups_inventory to load vars for managed-node3 43681 1727204701.10664: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204701.10675: Calling all_plugins_play to load vars for managed-node3 43681 1727204701.10677: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204701.10681: Calling groups_plugins_play to load vars for managed-node3 43681 1727204701.10868: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204701.11058: done with get_vars() 43681 1727204701.11068: done getting variables 43681 1727204701.11119: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 43681 1727204701.11215: variable 'interface' from source: set_fact TASK [Delete dummy interface ethtest0] ***************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Tuesday 24 September 2024 15:05:01 -0400 (0:00:00.020) 0:00:08.779 ***** 43681 1727204701.11243: entering _queue_task() for managed-node3/command 43681 1727204701.11475: worker is 1 (out of 1 available) 43681 1727204701.11492: exiting _queue_task() for managed-node3/command 43681 1727204701.11508: done queuing things up, now waiting for results queue to drain 43681 1727204701.11510: waiting for pending results... 
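Each of these conditionals reads variables whose provenance the log spells out: 'type' and 'interface' come from set_fact, 'state' comes from include params, and 'current_interfaces' from an earlier set_fact. A hedged sketch of a caller wiring them up that way (the ethtest0/veth/present values are taken from the log; the task names, include path, and overall structure of the real caller are assumptions):

- name: Pick the test interface    # hypothetical caller, not shown in the log
  ansible.builtin.set_fact:
    interface: ethtest0
    type: veth

- name: Manage the test interface
  ansible.builtin.include_tasks: tasks/manage_test_interface.yml
  vars:
    state: present    # matches "variable 'state' from source: include params"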
43681 1727204701.11681: running TaskExecutor() for managed-node3/TASK: Delete dummy interface ethtest0 43681 1727204701.11761: in run() - task 12b410aa-8751-9e86-7728-0000000001d4 43681 1727204701.11774: variable 'ansible_search_path' from source: unknown 43681 1727204701.11778: variable 'ansible_search_path' from source: unknown 43681 1727204701.11813: calling self._execute() 43681 1727204701.11892: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204701.11900: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204701.11909: variable 'omit' from source: magic vars 43681 1727204701.12233: variable 'ansible_distribution_major_version' from source: facts 43681 1727204701.12244: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204701.12425: variable 'type' from source: set_fact 43681 1727204701.12430: variable 'state' from source: include params 43681 1727204701.12436: variable 'interface' from source: set_fact 43681 1727204701.12441: variable 'current_interfaces' from source: set_fact 43681 1727204701.12450: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 43681 1727204701.12452: when evaluation is False, skipping this task 43681 1727204701.12455: _execute() done 43681 1727204701.12460: dumping result to json 43681 1727204701.12465: done dumping result, returning 43681 1727204701.12472: done running TaskExecutor() for managed-node3/TASK: Delete dummy interface ethtest0 [12b410aa-8751-9e86-7728-0000000001d4] 43681 1727204701.12478: sending task result for task 12b410aa-8751-9e86-7728-0000000001d4 43681 1727204701.12575: done sending task result for task 12b410aa-8751-9e86-7728-0000000001d4 43681 1727204701.12578: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 43681 1727204701.12654: no more pending results, returning what we have 43681 1727204701.12659: results queue empty 43681 1727204701.12660: checking for any_errors_fatal 43681 1727204701.12665: done checking for any_errors_fatal 43681 1727204701.12667: checking for max_fail_percentage 43681 1727204701.12668: done checking for max_fail_percentage 43681 1727204701.12669: checking to see if all hosts have failed and the running result is not ok 43681 1727204701.12670: done checking to see if all hosts have failed 43681 1727204701.12671: getting the remaining hosts for this loop 43681 1727204701.12673: done getting the remaining hosts for this loop 43681 1727204701.12677: getting the next task for host managed-node3 43681 1727204701.12682: done getting next task for host managed-node3 43681 1727204701.12685: ^ task is: TASK: Create tap interface {{ interface }} 43681 1727204701.12691: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204701.12695: getting variables 43681 1727204701.12696: in VariableManager get_vars() 43681 1727204701.12731: Calling all_inventory to load vars for managed-node3 43681 1727204701.12734: Calling groups_inventory to load vars for managed-node3 43681 1727204701.12736: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204701.12749: Calling all_plugins_play to load vars for managed-node3 43681 1727204701.12752: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204701.12756: Calling groups_plugins_play to load vars for managed-node3 43681 1727204701.12966: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204701.13150: done with get_vars() 43681 1727204701.13159: done getting variables 43681 1727204701.13210: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 43681 1727204701.13303: variable 'interface' from source: set_fact TASK [Create tap interface ethtest0] ******************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Tuesday 24 September 2024 15:05:01 -0400 (0:00:00.020) 0:00:08.799 ***** 43681 1727204701.13330: entering _queue_task() for managed-node3/command 43681 1727204701.13558: worker is 1 (out of 1 available) 43681 1727204701.13576: exiting _queue_task() for managed-node3/command 43681 1727204701.13591: done queuing things up, now waiting for results queue to drain 43681 1727204701.13593: waiting for pending results... 
43681 1727204701.13767: running TaskExecutor() for managed-node3/TASK: Create tap interface ethtest0 43681 1727204701.13849: in run() - task 12b410aa-8751-9e86-7728-0000000001d5 43681 1727204701.13861: variable 'ansible_search_path' from source: unknown 43681 1727204701.13865: variable 'ansible_search_path' from source: unknown 43681 1727204701.13900: calling self._execute() 43681 1727204701.13975: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204701.13981: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204701.13996: variable 'omit' from source: magic vars 43681 1727204701.14313: variable 'ansible_distribution_major_version' from source: facts 43681 1727204701.14326: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204701.14508: variable 'type' from source: set_fact 43681 1727204701.14515: variable 'state' from source: include params 43681 1727204701.14522: variable 'interface' from source: set_fact 43681 1727204701.14528: variable 'current_interfaces' from source: set_fact 43681 1727204701.14536: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 43681 1727204701.14540: when evaluation is False, skipping this task 43681 1727204701.14543: _execute() done 43681 1727204701.14545: dumping result to json 43681 1727204701.14551: done dumping result, returning 43681 1727204701.14557: done running TaskExecutor() for managed-node3/TASK: Create tap interface ethtest0 [12b410aa-8751-9e86-7728-0000000001d5] 43681 1727204701.14564: sending task result for task 12b410aa-8751-9e86-7728-0000000001d5 43681 1727204701.14656: done sending task result for task 12b410aa-8751-9e86-7728-0000000001d5 43681 1727204701.14659: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 43681 1727204701.14741: no more pending results, returning what we have 43681 1727204701.14745: results queue empty 43681 1727204701.14745: checking for any_errors_fatal 43681 1727204701.14752: done checking for any_errors_fatal 43681 1727204701.14753: checking for max_fail_percentage 43681 1727204701.14755: done checking for max_fail_percentage 43681 1727204701.14756: checking to see if all hosts have failed and the running result is not ok 43681 1727204701.14757: done checking to see if all hosts have failed 43681 1727204701.14758: getting the remaining hosts for this loop 43681 1727204701.14759: done getting the remaining hosts for this loop 43681 1727204701.14763: getting the next task for host managed-node3 43681 1727204701.14771: done getting next task for host managed-node3 43681 1727204701.14773: ^ task is: TASK: Delete tap interface {{ interface }} 43681 1727204701.14776: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204701.14780: getting variables 43681 1727204701.14782: in VariableManager get_vars() 43681 1727204701.14818: Calling all_inventory to load vars for managed-node3 43681 1727204701.14821: Calling groups_inventory to load vars for managed-node3 43681 1727204701.14824: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204701.14835: Calling all_plugins_play to load vars for managed-node3 43681 1727204701.14838: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204701.14841: Calling groups_plugins_play to load vars for managed-node3 43681 1727204701.15011: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204701.15200: done with get_vars() 43681 1727204701.15211: done getting variables 43681 1727204701.15260: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 43681 1727204701.15356: variable 'interface' from source: set_fact TASK [Delete tap interface ethtest0] ******************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Tuesday 24 September 2024 15:05:01 -0400 (0:00:00.020) 0:00:08.820 ***** 43681 1727204701.15381: entering _queue_task() for managed-node3/command 43681 1727204701.15622: worker is 1 (out of 1 available) 43681 1727204701.15638: exiting _queue_task() for managed-node3/command 43681 1727204701.15652: done queuing things up, now waiting for results queue to drain 43681 1727204701.15654: waiting for pending results... 
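Once the last tap branch below is skipped, the run includes assert_device_present.yml, which in turn includes get_interface_stat.yml (both visible further down). Neither file's contents appear in this log, so the following is only a hedged guess at the general shape of such a presence check; the /sys/class/net path, the stat/assert pairing, and the interface_stat variable name are all assumptions:

- name: Get stat for interface {{ interface }}    # hypothetical sketch of get_interface_stat.yml
  ansible.builtin.stat:
    path: /sys/class/net/{{ interface }}
  register: interface_stat

- name: Assert that device {{ interface }} is present    # hypothetical sketch of assert_device_present.yml
  ansible.builtin.assert:
    that:
      - interface_stat.stat.exists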
43681 1727204701.15827: running TaskExecutor() for managed-node3/TASK: Delete tap interface ethtest0 43681 1727204701.15911: in run() - task 12b410aa-8751-9e86-7728-0000000001d6 43681 1727204701.15927: variable 'ansible_search_path' from source: unknown 43681 1727204701.15931: variable 'ansible_search_path' from source: unknown 43681 1727204701.15965: calling self._execute() 43681 1727204701.16046: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204701.16053: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204701.16064: variable 'omit' from source: magic vars 43681 1727204701.16454: variable 'ansible_distribution_major_version' from source: facts 43681 1727204701.16465: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204701.16640: variable 'type' from source: set_fact 43681 1727204701.16644: variable 'state' from source: include params 43681 1727204701.16649: variable 'interface' from source: set_fact 43681 1727204701.16652: variable 'current_interfaces' from source: set_fact 43681 1727204701.16664: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 43681 1727204701.16667: when evaluation is False, skipping this task 43681 1727204701.16670: _execute() done 43681 1727204701.16672: dumping result to json 43681 1727204701.16677: done dumping result, returning 43681 1727204701.16684: done running TaskExecutor() for managed-node3/TASK: Delete tap interface ethtest0 [12b410aa-8751-9e86-7728-0000000001d6] 43681 1727204701.16692: sending task result for task 12b410aa-8751-9e86-7728-0000000001d6 43681 1727204701.16785: done sending task result for task 12b410aa-8751-9e86-7728-0000000001d6 43681 1727204701.16788: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 43681 1727204701.16839: no more pending results, returning what we have 43681 1727204701.16844: results queue empty 43681 1727204701.16846: checking for any_errors_fatal 43681 1727204701.16852: done checking for any_errors_fatal 43681 1727204701.16853: checking for max_fail_percentage 43681 1727204701.16855: done checking for max_fail_percentage 43681 1727204701.16856: checking to see if all hosts have failed and the running result is not ok 43681 1727204701.16857: done checking to see if all hosts have failed 43681 1727204701.16858: getting the remaining hosts for this loop 43681 1727204701.16859: done getting the remaining hosts for this loop 43681 1727204701.16864: getting the next task for host managed-node3 43681 1727204701.16872: done getting next task for host managed-node3 43681 1727204701.16877: ^ task is: TASK: Include the task 'assert_device_present.yml' 43681 1727204701.16880: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204701.16884: getting variables 43681 1727204701.16886: in VariableManager get_vars() 43681 1727204701.16925: Calling all_inventory to load vars for managed-node3 43681 1727204701.16928: Calling groups_inventory to load vars for managed-node3 43681 1727204701.16931: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204701.16942: Calling all_plugins_play to load vars for managed-node3 43681 1727204701.16944: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204701.16948: Calling groups_plugins_play to load vars for managed-node3 43681 1727204701.17172: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204701.17354: done with get_vars() 43681 1727204701.17363: done getting variables TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:20 Tuesday 24 September 2024 15:05:01 -0400 (0:00:00.020) 0:00:08.841 ***** 43681 1727204701.17443: entering _queue_task() for managed-node3/include_tasks 43681 1727204701.17673: worker is 1 (out of 1 available) 43681 1727204701.17690: exiting _queue_task() for managed-node3/include_tasks 43681 1727204701.17705: done queuing things up, now waiting for results queue to drain 43681 1727204701.17707: waiting for pending results... 43681 1727204701.17880: running TaskExecutor() for managed-node3/TASK: Include the task 'assert_device_present.yml' 43681 1727204701.17957: in run() - task 12b410aa-8751-9e86-7728-00000000000e 43681 1727204701.17970: variable 'ansible_search_path' from source: unknown 43681 1727204701.18004: calling self._execute() 43681 1727204701.18082: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204701.18092: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204701.18101: variable 'omit' from source: magic vars 43681 1727204701.18432: variable 'ansible_distribution_major_version' from source: facts 43681 1727204701.18444: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204701.18451: _execute() done 43681 1727204701.18454: dumping result to json 43681 1727204701.18459: done dumping result, returning 43681 1727204701.18466: done running TaskExecutor() for managed-node3/TASK: Include the task 'assert_device_present.yml' [12b410aa-8751-9e86-7728-00000000000e] 43681 1727204701.18472: sending task result for task 12b410aa-8751-9e86-7728-00000000000e 43681 1727204701.18574: done sending task result for task 12b410aa-8751-9e86-7728-00000000000e 43681 1727204701.18577: WORKER PROCESS EXITING 43681 1727204701.18618: no more pending results, returning what we have 43681 1727204701.18624: in VariableManager get_vars() 43681 1727204701.18669: Calling all_inventory to load vars for managed-node3 43681 1727204701.18672: Calling groups_inventory to load vars for managed-node3 43681 1727204701.18674: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204701.18687: Calling all_plugins_play to load vars for managed-node3 43681 1727204701.18693: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204701.18697: Calling groups_plugins_play to load vars for managed-node3 43681 1727204701.18891: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204701.19090: done with get_vars() 43681 
1727204701.19097: variable 'ansible_search_path' from source: unknown 43681 1727204701.19109: we have included files to process 43681 1727204701.19110: generating all_blocks data 43681 1727204701.19111: done generating all_blocks data 43681 1727204701.19116: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 43681 1727204701.19117: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 43681 1727204701.19120: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 43681 1727204701.19251: in VariableManager get_vars() 43681 1727204701.19267: done with get_vars() 43681 1727204701.19359: done processing included file 43681 1727204701.19361: iterating over new_blocks loaded from include file 43681 1727204701.19362: in VariableManager get_vars() 43681 1727204701.19373: done with get_vars() 43681 1727204701.19374: filtering new block on tags 43681 1727204701.19391: done filtering new block on tags 43681 1727204701.19393: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed-node3 43681 1727204701.19397: extending task lists for all hosts with included blocks 43681 1727204701.20736: done extending task lists 43681 1727204701.20738: done processing included files 43681 1727204701.20739: results queue empty 43681 1727204701.20739: checking for any_errors_fatal 43681 1727204701.20742: done checking for any_errors_fatal 43681 1727204701.20743: checking for max_fail_percentage 43681 1727204701.20744: done checking for max_fail_percentage 43681 1727204701.20744: checking to see if all hosts have failed and the running result is not ok 43681 1727204701.20745: done checking to see if all hosts have failed 43681 1727204701.20746: getting the remaining hosts for this loop 43681 1727204701.20748: done getting the remaining hosts for this loop 43681 1727204701.20750: getting the next task for host managed-node3 43681 1727204701.20754: done getting next task for host managed-node3 43681 1727204701.20755: ^ task is: TASK: Include the task 'get_interface_stat.yml' 43681 1727204701.20758: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204701.20760: getting variables 43681 1727204701.20760: in VariableManager get_vars() 43681 1727204701.20771: Calling all_inventory to load vars for managed-node3 43681 1727204701.20773: Calling groups_inventory to load vars for managed-node3 43681 1727204701.20774: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204701.20780: Calling all_plugins_play to load vars for managed-node3 43681 1727204701.20782: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204701.20784: Calling groups_plugins_play to load vars for managed-node3 43681 1727204701.20915: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204701.21092: done with get_vars() 43681 1727204701.21101: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 15:05:01 -0400 (0:00:00.037) 0:00:08.878 ***** 43681 1727204701.21164: entering _queue_task() for managed-node3/include_tasks 43681 1727204701.21426: worker is 1 (out of 1 available) 43681 1727204701.21440: exiting _queue_task() for managed-node3/include_tasks 43681 1727204701.21455: done queuing things up, now waiting for results queue to drain 43681 1727204701.21457: waiting for pending results... 43681 1727204701.21641: running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' 43681 1727204701.21714: in run() - task 12b410aa-8751-9e86-7728-0000000002ec 43681 1727204701.21728: variable 'ansible_search_path' from source: unknown 43681 1727204701.21732: variable 'ansible_search_path' from source: unknown 43681 1727204701.21763: calling self._execute() 43681 1727204701.21844: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204701.21851: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204701.21861: variable 'omit' from source: magic vars 43681 1727204701.22194: variable 'ansible_distribution_major_version' from source: facts 43681 1727204701.22205: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204701.22213: _execute() done 43681 1727204701.22216: dumping result to json 43681 1727204701.22223: done dumping result, returning 43681 1727204701.22235: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' [12b410aa-8751-9e86-7728-0000000002ec] 43681 1727204701.22239: sending task result for task 12b410aa-8751-9e86-7728-0000000002ec 43681 1727204701.22330: done sending task result for task 12b410aa-8751-9e86-7728-0000000002ec 43681 1727204701.22332: WORKER PROCESS EXITING 43681 1727204701.22368: no more pending results, returning what we have 43681 1727204701.22374: in VariableManager get_vars() 43681 1727204701.22424: Calling all_inventory to load vars for managed-node3 43681 1727204701.22427: Calling groups_inventory to load vars for managed-node3 43681 1727204701.22430: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204701.22445: Calling all_plugins_play to load vars for managed-node3 43681 1727204701.22448: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204701.22452: Calling groups_plugins_play to load vars for managed-node3 43681 1727204701.22684: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 43681 1727204701.22865: done with get_vars() 43681 1727204701.22872: variable 'ansible_search_path' from source: unknown 43681 1727204701.22873: variable 'ansible_search_path' from source: unknown 43681 1727204701.22904: we have included files to process 43681 1727204701.22905: generating all_blocks data 43681 1727204701.22906: done generating all_blocks data 43681 1727204701.22907: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 43681 1727204701.22908: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 43681 1727204701.22909: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 43681 1727204701.23101: done processing included file 43681 1727204701.23102: iterating over new_blocks loaded from include file 43681 1727204701.23104: in VariableManager get_vars() 43681 1727204701.23116: done with get_vars() 43681 1727204701.23118: filtering new block on tags 43681 1727204701.23132: done filtering new block on tags 43681 1727204701.23134: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node3 43681 1727204701.23138: extending task lists for all hosts with included blocks 43681 1727204701.23221: done extending task lists 43681 1727204701.23222: done processing included files 43681 1727204701.23223: results queue empty 43681 1727204701.23223: checking for any_errors_fatal 43681 1727204701.23227: done checking for any_errors_fatal 43681 1727204701.23227: checking for max_fail_percentage 43681 1727204701.23228: done checking for max_fail_percentage 43681 1727204701.23229: checking to see if all hosts have failed and the running result is not ok 43681 1727204701.23230: done checking to see if all hosts have failed 43681 1727204701.23230: getting the remaining hosts for this loop 43681 1727204701.23231: done getting the remaining hosts for this loop 43681 1727204701.23233: getting the next task for host managed-node3 43681 1727204701.23236: done getting next task for host managed-node3 43681 1727204701.23238: ^ task is: TASK: Get stat for interface {{ interface }} 43681 1727204701.23240: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204701.23242: getting variables 43681 1727204701.23243: in VariableManager get_vars() 43681 1727204701.23251: Calling all_inventory to load vars for managed-node3 43681 1727204701.23253: Calling groups_inventory to load vars for managed-node3 43681 1727204701.23255: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204701.23261: Calling all_plugins_play to load vars for managed-node3 43681 1727204701.23264: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204701.23266: Calling groups_plugins_play to load vars for managed-node3 43681 1727204701.23394: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204701.23586: done with get_vars() 43681 1727204701.23597: done getting variables 43681 1727204701.23725: variable 'interface' from source: set_fact TASK [Get stat for interface ethtest0] ***************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 15:05:01 -0400 (0:00:00.025) 0:00:08.904 ***** 43681 1727204701.23749: entering _queue_task() for managed-node3/stat 43681 1727204701.23994: worker is 1 (out of 1 available) 43681 1727204701.24010: exiting _queue_task() for managed-node3/stat 43681 1727204701.24023: done queuing things up, now waiting for results queue to drain 43681 1727204701.24025: waiting for pending results... 43681 1727204701.24197: running TaskExecutor() for managed-node3/TASK: Get stat for interface ethtest0 43681 1727204701.24285: in run() - task 12b410aa-8751-9e86-7728-0000000003b5 43681 1727204701.24300: variable 'ansible_search_path' from source: unknown 43681 1727204701.24303: variable 'ansible_search_path' from source: unknown 43681 1727204701.24337: calling self._execute() 43681 1727204701.24411: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204701.24418: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204701.24431: variable 'omit' from source: magic vars 43681 1727204701.24746: variable 'ansible_distribution_major_version' from source: facts 43681 1727204701.24757: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204701.24764: variable 'omit' from source: magic vars 43681 1727204701.24809: variable 'omit' from source: magic vars 43681 1727204701.24892: variable 'interface' from source: set_fact 43681 1727204701.24909: variable 'omit' from source: magic vars 43681 1727204701.24948: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204701.24981: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204701.25001: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204701.25021: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204701.25035: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204701.25063: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204701.25066: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204701.25071: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node3' 43681 1727204701.25159: Set connection var ansible_shell_type to sh 43681 1727204701.25165: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204701.25172: Set connection var ansible_timeout to 10 43681 1727204701.25181: Set connection var ansible_pipelining to False 43681 1727204701.25187: Set connection var ansible_connection to ssh 43681 1727204701.25195: Set connection var ansible_shell_executable to /bin/sh 43681 1727204701.25215: variable 'ansible_shell_executable' from source: unknown 43681 1727204701.25218: variable 'ansible_connection' from source: unknown 43681 1727204701.25224: variable 'ansible_module_compression' from source: unknown 43681 1727204701.25229: variable 'ansible_shell_type' from source: unknown 43681 1727204701.25232: variable 'ansible_shell_executable' from source: unknown 43681 1727204701.25234: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204701.25247: variable 'ansible_pipelining' from source: unknown 43681 1727204701.25250: variable 'ansible_timeout' from source: unknown 43681 1727204701.25252: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204701.25426: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 43681 1727204701.25437: variable 'omit' from source: magic vars 43681 1727204701.25443: starting attempt loop 43681 1727204701.25446: running the handler 43681 1727204701.25461: _low_level_execute_command(): starting 43681 1727204701.25473: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 43681 1727204701.26023: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204701.26027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204701.26031: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 43681 1727204701.26035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204701.26092: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204701.26096: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204701.26103: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204701.26142: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204701.27913: stdout chunk (state=3): >>>/root <<< 43681 1727204701.28025: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204701.28085: stderr chunk (state=3): >>><<< 43681 
1727204701.28088: stdout chunk (state=3): >>><<< 43681 1727204701.28120: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204701.28131: _low_level_execute_command(): starting 43681 1727204701.28140: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204701.281173-44097-77065559115884 `" && echo ansible-tmp-1727204701.281173-44097-77065559115884="` echo /root/.ansible/tmp/ansible-tmp-1727204701.281173-44097-77065559115884 `" ) && sleep 0' 43681 1727204701.28616: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204701.28621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 43681 1727204701.28624: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204701.28635: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204701.28686: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204701.28692: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204701.28739: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204701.30749: stdout chunk (state=3): >>>ansible-tmp-1727204701.281173-44097-77065559115884=/root/.ansible/tmp/ansible-tmp-1727204701.281173-44097-77065559115884 <<< 43681 1727204701.30864: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204701.30928: stderr chunk (state=3): >>><<< 43681 1727204701.30932: stdout chunk 
(state=3): >>><<< 43681 1727204701.30951: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204701.281173-44097-77065559115884=/root/.ansible/tmp/ansible-tmp-1727204701.281173-44097-77065559115884 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204701.30998: variable 'ansible_module_compression' from source: unknown 43681 1727204701.31049: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-436817kzep24q/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 43681 1727204701.31082: variable 'ansible_facts' from source: unknown 43681 1727204701.31149: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204701.281173-44097-77065559115884/AnsiballZ_stat.py 43681 1727204701.31435: Sending initial data 43681 1727204701.31438: Sent initial data (151 bytes) 43681 1727204701.31924: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204701.31935: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204701.31947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204701.32004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204701.32062: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204701.32075: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204701.32094: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204701.32164: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204701.33810: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension 
"posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 43681 1727204701.33846: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 43681 1727204701.33881: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-436817kzep24q/tmplb5rgfk7 /root/.ansible/tmp/ansible-tmp-1727204701.281173-44097-77065559115884/AnsiballZ_stat.py <<< 43681 1727204701.33895: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204701.281173-44097-77065559115884/AnsiballZ_stat.py" <<< 43681 1727204701.33914: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-436817kzep24q/tmplb5rgfk7" to remote "/root/.ansible/tmp/ansible-tmp-1727204701.281173-44097-77065559115884/AnsiballZ_stat.py" <<< 43681 1727204701.33923: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204701.281173-44097-77065559115884/AnsiballZ_stat.py" <<< 43681 1727204701.34684: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204701.34853: stderr chunk (state=3): >>><<< 43681 1727204701.34856: stdout chunk (state=3): >>><<< 43681 1727204701.34859: done transferring module to remote 43681 1727204701.34862: _low_level_execute_command(): starting 43681 1727204701.34864: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204701.281173-44097-77065559115884/ /root/.ansible/tmp/ansible-tmp-1727204701.281173-44097-77065559115884/AnsiballZ_stat.py && sleep 0' 43681 1727204701.35500: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204701.35509: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204701.35594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204701.35599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204701.35602: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 43681 1727204701.35653: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204701.35673: stderr chunk (state=3): >>>debug1: auto-mux: 
Trying existing master <<< 43681 1727204701.35687: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204701.35706: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204701.35770: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204701.37737: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204701.37827: stderr chunk (state=3): >>><<< 43681 1727204701.37837: stdout chunk (state=3): >>><<< 43681 1727204701.37861: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204701.37870: _low_level_execute_command(): starting 43681 1727204701.37880: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204701.281173-44097-77065559115884/AnsiballZ_stat.py && sleep 0' 43681 1727204701.38612: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204701.38717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204701.38734: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204701.38783: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204701.38843: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204701.56430: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/ethtest0", "mode": "0777", "isdir": false, "ischr": 
false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 39450, "dev": 23, "nlink": 1, "atime": 1727204699.7459548, "mtime": 1727204699.7459548, "ctime": 1727204699.7459548, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 43681 1727204701.58036: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 43681 1727204701.58040: stdout chunk (state=3): >>><<< 43681 1727204701.58043: stderr chunk (state=3): >>><<< 43681 1727204701.58067: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/ethtest0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 39450, "dev": 23, "nlink": 1, "atime": 1727204699.7459548, "mtime": 1727204699.7459548, "ctime": 1727204699.7459548, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
43681 1727204701.58166: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204701.281173-44097-77065559115884/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 43681 1727204701.58277: _low_level_execute_command(): starting 43681 1727204701.58281: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204701.281173-44097-77065559115884/ > /dev/null 2>&1 && sleep 0' 43681 1727204701.58923: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204701.58952: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204701.59043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204701.59065: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204701.59115: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204701.59144: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204701.59180: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204701.59249: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204701.61327: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204701.61331: stdout chunk (state=3): >>><<< 43681 1727204701.61334: stderr chunk (state=3): >>><<< 43681 1727204701.61496: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204701.61500: handler run complete 43681 1727204701.61503: attempt loop complete, returning result 43681 1727204701.61505: _execute() done 43681 1727204701.61507: dumping result to json 43681 1727204701.61509: done dumping result, returning 43681 1727204701.61511: done running TaskExecutor() for managed-node3/TASK: Get stat for interface ethtest0 [12b410aa-8751-9e86-7728-0000000003b5] 43681 1727204701.61514: sending task result for task 12b410aa-8751-9e86-7728-0000000003b5 43681 1727204701.61835: done sending task result for task 12b410aa-8751-9e86-7728-0000000003b5 43681 1727204701.61839: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "stat": { "atime": 1727204699.7459548, "block_size": 4096, "blocks": 0, "ctime": 1727204699.7459548, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 39450, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "mode": "0777", "mtime": 1727204699.7459548, "nlink": 1, "path": "/sys/class/net/ethtest0", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 43681 1727204701.61993: no more pending results, returning what we have 43681 1727204701.61998: results queue empty 43681 1727204701.61999: checking for any_errors_fatal 43681 1727204701.62001: done checking for any_errors_fatal 43681 1727204701.62002: checking for max_fail_percentage 43681 1727204701.62004: done checking for max_fail_percentage 43681 1727204701.62005: checking to see if all hosts have failed and the running result is not ok 43681 1727204701.62006: done checking to see if all hosts have failed 43681 1727204701.62007: getting the remaining hosts for this loop 43681 1727204701.62009: done getting the remaining hosts for this loop 43681 1727204701.62014: getting the next task for host managed-node3 43681 1727204701.62026: done getting next task for host managed-node3 43681 1727204701.62030: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 43681 1727204701.62034: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204701.62041: getting variables 43681 1727204701.62043: in VariableManager get_vars() 43681 1727204701.62201: Calling all_inventory to load vars for managed-node3 43681 1727204701.62204: Calling groups_inventory to load vars for managed-node3 43681 1727204701.62207: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204701.62222: Calling all_plugins_play to load vars for managed-node3 43681 1727204701.62225: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204701.62230: Calling groups_plugins_play to load vars for managed-node3 43681 1727204701.62611: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204701.63238: done with get_vars() 43681 1727204701.63255: done getting variables 43681 1727204701.63379: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 43681 1727204701.63538: variable 'interface' from source: set_fact TASK [Assert that the interface is present - 'ethtest0'] *********************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 15:05:01 -0400 (0:00:00.398) 0:00:09.302 ***** 43681 1727204701.63574: entering _queue_task() for managed-node3/assert 43681 1727204701.63576: Creating lock for assert 43681 1727204701.63950: worker is 1 (out of 1 available) 43681 1727204701.64079: exiting _queue_task() for managed-node3/assert 43681 1727204701.64094: done queuing things up, now waiting for results queue to drain 43681 1727204701.64096: waiting for pending results... 
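The assert task queued here (assert_device_present.yml:5) evaluates interface_stat.stat.exists, as the entries that follow show. A minimal sketch of what that task plausibly looks like, under the assumption that it checks only this one condition:

    # Sketch of the assert task in tasks/assert_device_present.yml, reconstructed
    # from the conditional evaluated in the log; not copied from the file.
    - name: Assert that the interface is present - '{{ interface }}'
      assert:
        that:
          - interface_stat.stat.exists

If the link /sys/class/net/ethtest0 were missing, this assert would fail the host instead of producing the "All assertions passed" result seen below.
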
43681 1727204701.64413: running TaskExecutor() for managed-node3/TASK: Assert that the interface is present - 'ethtest0' 43681 1727204701.64508: in run() - task 12b410aa-8751-9e86-7728-0000000002ed 43681 1727204701.64512: variable 'ansible_search_path' from source: unknown 43681 1727204701.64515: variable 'ansible_search_path' from source: unknown 43681 1727204701.64521: calling self._execute() 43681 1727204701.64620: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204701.64639: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204701.64658: variable 'omit' from source: magic vars 43681 1727204701.65235: variable 'ansible_distribution_major_version' from source: facts 43681 1727204701.65254: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204701.65271: variable 'omit' from source: magic vars 43681 1727204701.65379: variable 'omit' from source: magic vars 43681 1727204701.65461: variable 'interface' from source: set_fact 43681 1727204701.65494: variable 'omit' from source: magic vars 43681 1727204701.65551: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204701.65603: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204701.65637: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204701.65662: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204701.65680: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204701.65796: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204701.65799: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204701.65803: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204701.65886: Set connection var ansible_shell_type to sh 43681 1727204701.65901: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204701.65913: Set connection var ansible_timeout to 10 43681 1727204701.65939: Set connection var ansible_pipelining to False 43681 1727204701.65952: Set connection var ansible_connection to ssh 43681 1727204701.65964: Set connection var ansible_shell_executable to /bin/sh 43681 1727204701.65996: variable 'ansible_shell_executable' from source: unknown 43681 1727204701.66005: variable 'ansible_connection' from source: unknown 43681 1727204701.66013: variable 'ansible_module_compression' from source: unknown 43681 1727204701.66023: variable 'ansible_shell_type' from source: unknown 43681 1727204701.66034: variable 'ansible_shell_executable' from source: unknown 43681 1727204701.66042: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204701.66056: variable 'ansible_pipelining' from source: unknown 43681 1727204701.66139: variable 'ansible_timeout' from source: unknown 43681 1727204701.66143: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204701.66250: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 43681 1727204701.66277: variable 'omit' from source: magic vars 43681 1727204701.66288: starting attempt loop 43681 1727204701.66298: running the handler 43681 1727204701.66495: variable 'interface_stat' from source: set_fact 43681 1727204701.66585: Evaluated conditional (interface_stat.stat.exists): True 43681 1727204701.66589: handler run complete 43681 1727204701.66593: attempt loop complete, returning result 43681 1727204701.66595: _execute() done 43681 1727204701.66598: dumping result to json 43681 1727204701.66600: done dumping result, returning 43681 1727204701.66602: done running TaskExecutor() for managed-node3/TASK: Assert that the interface is present - 'ethtest0' [12b410aa-8751-9e86-7728-0000000002ed] 43681 1727204701.66605: sending task result for task 12b410aa-8751-9e86-7728-0000000002ed 43681 1727204701.66774: done sending task result for task 12b410aa-8751-9e86-7728-0000000002ed 43681 1727204701.66778: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 43681 1727204701.66861: no more pending results, returning what we have 43681 1727204701.66867: results queue empty 43681 1727204701.66868: checking for any_errors_fatal 43681 1727204701.66879: done checking for any_errors_fatal 43681 1727204701.66880: checking for max_fail_percentage 43681 1727204701.66883: done checking for max_fail_percentage 43681 1727204701.66883: checking to see if all hosts have failed and the running result is not ok 43681 1727204701.66885: done checking to see if all hosts have failed 43681 1727204701.66886: getting the remaining hosts for this loop 43681 1727204701.66887: done getting the remaining hosts for this loop 43681 1727204701.66894: getting the next task for host managed-node3 43681 1727204701.66903: done getting next task for host managed-node3 43681 1727204701.66907: ^ task is: TASK: Create a dedicated test file in `/etc/iproute2/rt_tables.d/` and add a new routing table 43681 1727204701.66910: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204701.66915: getting variables 43681 1727204701.66919: in VariableManager get_vars() 43681 1727204701.66965: Calling all_inventory to load vars for managed-node3 43681 1727204701.66968: Calling groups_inventory to load vars for managed-node3 43681 1727204701.66971: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204701.66983: Calling all_plugins_play to load vars for managed-node3 43681 1727204701.66988: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204701.67211: Calling groups_plugins_play to load vars for managed-node3 43681 1727204701.67572: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204701.67918: done with get_vars() 43681 1727204701.67932: done getting variables TASK [Create a dedicated test file in `/etc/iproute2/rt_tables.d/` and add a new routing table] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:23 Tuesday 24 September 2024 15:05:01 -0400 (0:00:00.044) 0:00:09.347 ***** 43681 1727204701.68047: entering _queue_task() for managed-node3/lineinfile 43681 1727204701.68049: Creating lock for lineinfile 43681 1727204701.68380: worker is 1 (out of 1 available) 43681 1727204701.68522: exiting _queue_task() for managed-node3/lineinfile 43681 1727204701.68536: done queuing things up, now waiting for results queue to drain 43681 1727204701.68538: waiting for pending results... 43681 1727204701.68777: running TaskExecutor() for managed-node3/TASK: Create a dedicated test file in `/etc/iproute2/rt_tables.d/` and add a new routing table 43681 1727204701.68954: in run() - task 12b410aa-8751-9e86-7728-00000000000f 43681 1727204701.68959: variable 'ansible_search_path' from source: unknown 43681 1727204701.68963: calling self._execute() 43681 1727204701.69027: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204701.69042: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204701.69066: variable 'omit' from source: magic vars 43681 1727204701.69570: variable 'ansible_distribution_major_version' from source: facts 43681 1727204701.69595: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204701.69622: variable 'omit' from source: magic vars 43681 1727204701.69651: variable 'omit' from source: magic vars 43681 1727204701.69706: variable 'omit' from source: magic vars 43681 1727204701.69827: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204701.69833: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204701.69936: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204701.69940: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204701.69945: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204701.69960: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204701.69971: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204701.69981: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204701.70130: Set connection var ansible_shell_type to 
sh 43681 1727204701.70154: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204701.70263: Set connection var ansible_timeout to 10 43681 1727204701.70267: Set connection var ansible_pipelining to False 43681 1727204701.70269: Set connection var ansible_connection to ssh 43681 1727204701.70274: Set connection var ansible_shell_executable to /bin/sh 43681 1727204701.70276: variable 'ansible_shell_executable' from source: unknown 43681 1727204701.70279: variable 'ansible_connection' from source: unknown 43681 1727204701.70281: variable 'ansible_module_compression' from source: unknown 43681 1727204701.70283: variable 'ansible_shell_type' from source: unknown 43681 1727204701.70286: variable 'ansible_shell_executable' from source: unknown 43681 1727204701.70288: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204701.70292: variable 'ansible_pipelining' from source: unknown 43681 1727204701.70294: variable 'ansible_timeout' from source: unknown 43681 1727204701.70297: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204701.70571: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 43681 1727204701.70630: variable 'omit' from source: magic vars 43681 1727204701.70634: starting attempt loop 43681 1727204701.70638: running the handler 43681 1727204701.70644: _low_level_execute_command(): starting 43681 1727204701.70659: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 43681 1727204701.71595: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204701.71632: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204701.71651: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204701.71685: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204701.71767: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204701.73510: stdout chunk (state=3): >>>/root <<< 43681 1727204701.73716: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204701.73739: stderr chunk (state=3): >>><<< 43681 1727204701.73751: stdout chunk (state=3): >>><<< 43681 1727204701.73787: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204701.73819: _low_level_execute_command(): starting 43681 1727204701.73841: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204701.7380385-44109-177876587600691 `" && echo ansible-tmp-1727204701.7380385-44109-177876587600691="` echo /root/.ansible/tmp/ansible-tmp-1727204701.7380385-44109-177876587600691 `" ) && sleep 0' 43681 1727204701.74525: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204701.74542: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204701.74564: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204701.74585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204701.74606: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 43681 1727204701.74619: stderr chunk (state=3): >>>debug2: match not found <<< 43681 1727204701.74674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204701.74750: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204701.74787: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204701.74833: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204701.74863: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204701.76924: stdout chunk (state=3): >>>ansible-tmp-1727204701.7380385-44109-177876587600691=/root/.ansible/tmp/ansible-tmp-1727204701.7380385-44109-177876587600691 <<< 43681 1727204701.77029: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204701.77104: stderr chunk (state=3): >>><<< 43681 1727204701.77116: stdout chunk (state=3): >>><<< 43681 
1727204701.77297: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204701.7380385-44109-177876587600691=/root/.ansible/tmp/ansible-tmp-1727204701.7380385-44109-177876587600691 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204701.77301: variable 'ansible_module_compression' from source: unknown 43681 1727204701.77304: ANSIBALLZ: Using lock for lineinfile 43681 1727204701.77307: ANSIBALLZ: Acquiring lock 43681 1727204701.77309: ANSIBALLZ: Lock acquired: 140156135582144 43681 1727204701.77311: ANSIBALLZ: Creating module 43681 1727204701.94345: ANSIBALLZ: Writing module into payload 43681 1727204701.94453: ANSIBALLZ: Writing module 43681 1727204701.94475: ANSIBALLZ: Renaming module 43681 1727204701.94482: ANSIBALLZ: Done creating module 43681 1727204701.94503: variable 'ansible_facts' from source: unknown 43681 1727204701.94552: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204701.7380385-44109-177876587600691/AnsiballZ_lineinfile.py 43681 1727204701.94680: Sending initial data 43681 1727204701.94684: Sent initial data (159 bytes) 43681 1727204701.95172: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204701.95176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204701.95179: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204701.95181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204701.95247: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204701.95254: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 
1727204701.95293: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204701.97007: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 43681 1727204701.97041: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 43681 1727204701.97075: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-436817kzep24q/tmp5co_obd9 /root/.ansible/tmp/ansible-tmp-1727204701.7380385-44109-177876587600691/AnsiballZ_lineinfile.py <<< 43681 1727204701.97079: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204701.7380385-44109-177876587600691/AnsiballZ_lineinfile.py" <<< 43681 1727204701.97109: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-436817kzep24q/tmp5co_obd9" to remote "/root/.ansible/tmp/ansible-tmp-1727204701.7380385-44109-177876587600691/AnsiballZ_lineinfile.py" <<< 43681 1727204701.97114: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204701.7380385-44109-177876587600691/AnsiballZ_lineinfile.py" <<< 43681 1727204701.97906: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204701.97986: stderr chunk (state=3): >>><<< 43681 1727204701.97992: stdout chunk (state=3): >>><<< 43681 1727204701.98013: done transferring module to remote 43681 1727204701.98026: _low_level_execute_command(): starting 43681 1727204701.98031: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204701.7380385-44109-177876587600691/ /root/.ansible/tmp/ansible-tmp-1727204701.7380385-44109-177876587600691/AnsiballZ_lineinfile.py && sleep 0' 43681 1727204701.98528: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204701.98531: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204701.98534: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204701.98536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204701.98603: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204701.98613: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204701.98650: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204702.00581: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204702.00644: stderr chunk (state=3): >>><<< 43681 1727204702.00648: stdout chunk (state=3): >>><<< 43681 1727204702.00663: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204702.00666: _low_level_execute_command(): starting 43681 1727204702.00672: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204701.7380385-44109-177876587600691/AnsiballZ_lineinfile.py && sleep 0' 43681 1727204702.01154: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204702.01158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204702.01161: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204702.01163: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204702.01220: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204702.01227: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204702.01275: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 43681 1727204702.20216: stdout chunk (state=3): >>> {"changed": true, "msg": "line added", "backup": "", "diff": [{"before": "", "after": "", "before_header": "/etc/iproute2/rt_tables.d/table.conf (content)", "after_header": "/etc/iproute2/rt_tables.d/table.conf (content)"}, {"before_header": "/etc/iproute2/rt_tables.d/table.conf (file attributes)", "after_header": "/etc/iproute2/rt_tables.d/table.conf (file attributes)"}], "invocation": {"module_args": {"path": "/etc/iproute2/rt_tables.d/table.conf", "line": "200 custom", "mode": "0644", "create": true, "state": "present", "backrefs": false, "backup": false, "firstmatch": false, "unsafe_writes": false, "regexp": null, "search_string": null, "insertafter": null, "insertbefore": null, "validate": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 43681 1727204702.21733: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 43681 1727204702.21799: stderr chunk (state=3): >>><<< 43681 1727204702.21803: stdout chunk (state=3): >>><<< 43681 1727204702.21823: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "msg": "line added", "backup": "", "diff": [{"before": "", "after": "", "before_header": "/etc/iproute2/rt_tables.d/table.conf (content)", "after_header": "/etc/iproute2/rt_tables.d/table.conf (content)"}, {"before_header": "/etc/iproute2/rt_tables.d/table.conf (file attributes)", "after_header": "/etc/iproute2/rt_tables.d/table.conf (file attributes)"}], "invocation": {"module_args": {"path": "/etc/iproute2/rt_tables.d/table.conf", "line": "200 custom", "mode": "0644", "create": true, "state": "present", "backrefs": false, "backup": false, "firstmatch": false, "unsafe_writes": false, "regexp": null, "search_string": null, "insertafter": null, "insertbefore": null, "validate": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
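The module_args echoed in the result above are enough to reconstruct the playbook side of this exchange, and the surrounding entries show the full remote execution cycle for one module run: a temporary directory is created under /root/.ansible/tmp, the AnsiballZ_lineinfile.py payload is uploaded over SFTP, made executable, run with /usr/bin/python3.12, and its JSON result is read back from stdout before the directory is removed. A minimal sketch of the task, assuming nothing beyond the logged arguments and the task name reported a few entries later (the rest of the play is unknown):

    - name: Create a dedicated test file in `/etc/iproute2/rt_tables.d/` and add a new routing table
      lineinfile:
        path: /etc/iproute2/rt_tables.d/table.conf   # values below are taken from the logged module_args
        line: "200 custom"
        create: true
        state: present
        mode: "0644"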
43681 1727204702.21866: done with _execute_module (lineinfile, {'path': '/etc/iproute2/rt_tables.d/table.conf', 'line': '200 custom', 'mode': '0644', 'create': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'lineinfile', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204701.7380385-44109-177876587600691/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 43681 1727204702.21877: _low_level_execute_command(): starting 43681 1727204702.21883: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204701.7380385-44109-177876587600691/ > /dev/null 2>&1 && sleep 0' 43681 1727204702.22349: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204702.22383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204702.22386: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 43681 1727204702.22399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204702.22401: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204702.22404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204702.22461: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204702.22464: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204702.22468: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204702.22510: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204702.24459: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204702.24519: stderr chunk (state=3): >>><<< 43681 1727204702.24523: stdout chunk (state=3): >>><<< 43681 1727204702.24542: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204702.24550: handler run complete 43681 1727204702.24577: attempt loop complete, returning result 43681 1727204702.24581: _execute() done 43681 1727204702.24583: dumping result to json 43681 1727204702.24593: done dumping result, returning 43681 1727204702.24602: done running TaskExecutor() for managed-node3/TASK: Create a dedicated test file in `/etc/iproute2/rt_tables.d/` and add a new routing table [12b410aa-8751-9e86-7728-00000000000f] 43681 1727204702.24608: sending task result for task 12b410aa-8751-9e86-7728-00000000000f 43681 1727204702.24728: done sending task result for task 12b410aa-8751-9e86-7728-00000000000f 43681 1727204702.24731: WORKER PROCESS EXITING changed: [managed-node3] => { "backup": "", "changed": true } MSG: line added 43681 1727204702.24816: no more pending results, returning what we have 43681 1727204702.24820: results queue empty 43681 1727204702.24821: checking for any_errors_fatal 43681 1727204702.24827: done checking for any_errors_fatal 43681 1727204702.24828: checking for max_fail_percentage 43681 1727204702.24830: done checking for max_fail_percentage 43681 1727204702.24831: checking to see if all hosts have failed and the running result is not ok 43681 1727204702.24832: done checking to see if all hosts have failed 43681 1727204702.24833: getting the remaining hosts for this loop 43681 1727204702.24834: done getting the remaining hosts for this loop 43681 1727204702.24839: getting the next task for host managed-node3 43681 1727204702.24845: done getting next task for host managed-node3 43681 1727204702.24852: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 43681 1727204702.24855: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204702.24871: getting variables 43681 1727204702.24873: in VariableManager get_vars() 43681 1727204702.24920: Calling all_inventory to load vars for managed-node3 43681 1727204702.24924: Calling groups_inventory to load vars for managed-node3 43681 1727204702.24926: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204702.24937: Calling all_plugins_play to load vars for managed-node3 43681 1727204702.24940: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204702.24943: Calling groups_plugins_play to load vars for managed-node3 43681 1727204702.25136: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204702.25354: done with get_vars() 43681 1727204702.25364: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 15:05:02 -0400 (0:00:00.573) 0:00:09.921 ***** 43681 1727204702.25447: entering _queue_task() for managed-node3/include_tasks 43681 1727204702.25679: worker is 1 (out of 1 available) 43681 1727204702.25696: exiting _queue_task() for managed-node3/include_tasks 43681 1727204702.25711: done queuing things up, now waiting for results queue to drain 43681 1727204702.25713: waiting for pending results... 43681 1727204702.25894: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 43681 1727204702.26000: in run() - task 12b410aa-8751-9e86-7728-000000000017 43681 1727204702.26012: variable 'ansible_search_path' from source: unknown 43681 1727204702.26016: variable 'ansible_search_path' from source: unknown 43681 1727204702.26051: calling self._execute() 43681 1727204702.26124: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204702.26131: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204702.26141: variable 'omit' from source: magic vars 43681 1727204702.26460: variable 'ansible_distribution_major_version' from source: facts 43681 1727204702.26471: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204702.26480: _execute() done 43681 1727204702.26484: dumping result to json 43681 1727204702.26487: done dumping result, returning 43681 1727204702.26501: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12b410aa-8751-9e86-7728-000000000017] 43681 1727204702.26504: sending task result for task 12b410aa-8751-9e86-7728-000000000017 43681 1727204702.26599: done sending task result for task 12b410aa-8751-9e86-7728-000000000017 43681 1727204702.26604: WORKER PROCESS EXITING 43681 1727204702.26650: no more pending results, returning what we have 43681 1727204702.26655: in VariableManager get_vars() 43681 1727204702.26699: Calling all_inventory to load vars for managed-node3 43681 1727204702.26703: Calling groups_inventory to load vars for managed-node3 43681 1727204702.26705: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204702.26716: Calling all_plugins_play to load vars for managed-node3 43681 1727204702.26719: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204702.26723: Calling groups_plugins_play to load vars for managed-node3 43681 1727204702.26908: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204702.27088: done with get_vars() 43681 1727204702.27097: variable 'ansible_search_path' from source: unknown 43681 1727204702.27097: variable 'ansible_search_path' from source: unknown 43681 1727204702.27132: we have included files to process 43681 1727204702.27133: generating all_blocks data 43681 1727204702.27135: done generating all_blocks data 43681 1727204702.27139: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 43681 1727204702.27140: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 43681 1727204702.27142: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 43681 1727204702.27745: done processing included file 43681 1727204702.27746: iterating over new_blocks loaded from include file 43681 1727204702.27747: in VariableManager get_vars() 43681 1727204702.27768: done with get_vars() 43681 1727204702.27770: filtering new block on tags 43681 1727204702.27784: done filtering new block on tags 43681 1727204702.27786: in VariableManager get_vars() 43681 1727204702.27803: done with get_vars() 43681 1727204702.27804: filtering new block on tags 43681 1727204702.27821: done filtering new block on tags 43681 1727204702.27822: in VariableManager get_vars() 43681 1727204702.27838: done with get_vars() 43681 1727204702.27839: filtering new block on tags 43681 1727204702.27852: done filtering new block on tags 43681 1727204702.27854: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node3 43681 1727204702.27858: extending task lists for all hosts with included blocks 43681 1727204702.28485: done extending task lists 43681 1727204702.28486: done processing included files 43681 1727204702.28487: results queue empty 43681 1727204702.28488: checking for any_errors_fatal 43681 1727204702.28494: done checking for any_errors_fatal 43681 1727204702.28495: checking for max_fail_percentage 43681 1727204702.28496: done checking for max_fail_percentage 43681 1727204702.28496: checking to see if all hosts have failed and the running result is not ok 43681 1727204702.28497: done checking to see if all hosts have failed 43681 1727204702.28498: getting the remaining hosts for this loop 43681 1727204702.28499: done getting the remaining hosts for this loop 43681 1727204702.28501: getting the next task for host managed-node3 43681 1727204702.28504: done getting next task for host managed-node3 43681 1727204702.28506: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 43681 1727204702.28509: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 43681 1727204702.28517: getting variables 43681 1727204702.28518: in VariableManager get_vars() 43681 1727204702.28538: Calling all_inventory to load vars for managed-node3 43681 1727204702.28540: Calling groups_inventory to load vars for managed-node3 43681 1727204702.28542: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204702.28547: Calling all_plugins_play to load vars for managed-node3 43681 1727204702.28549: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204702.28551: Calling groups_plugins_play to load vars for managed-node3 43681 1727204702.28679: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204702.28880: done with get_vars() 43681 1727204702.28890: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 15:05:02 -0400 (0:00:00.035) 0:00:09.956 ***** 43681 1727204702.28950: entering _queue_task() for managed-node3/setup 43681 1727204702.29206: worker is 1 (out of 1 available) 43681 1727204702.29221: exiting _queue_task() for managed-node3/setup 43681 1727204702.29236: done queuing things up, now waiting for results queue to drain 43681 1727204702.29238: waiting for pending results... 43681 1727204702.29423: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 43681 1727204702.29542: in run() - task 12b410aa-8751-9e86-7728-0000000003d0 43681 1727204702.29556: variable 'ansible_search_path' from source: unknown 43681 1727204702.29560: variable 'ansible_search_path' from source: unknown 43681 1727204702.29602: calling self._execute() 43681 1727204702.29665: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204702.29670: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204702.29686: variable 'omit' from source: magic vars 43681 1727204702.30001: variable 'ansible_distribution_major_version' from source: facts 43681 1727204702.30017: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204702.30204: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 43681 1727204702.32162: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 43681 1727204702.32218: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 43681 1727204702.32263: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 43681 1727204702.32293: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 43681 1727204702.32320: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 43681 1727204702.32393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 
43681 1727204702.32423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204702.32446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204702.32479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204702.32493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204702.32545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204702.32565: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204702.32585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204702.32618: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204702.32635: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204702.32770: variable '__network_required_facts' from source: role '' defaults 43681 1727204702.32778: variable 'ansible_facts' from source: unknown 43681 1727204702.32856: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 43681 1727204702.32862: when evaluation is False, skipping this task 43681 1727204702.32865: _execute() done 43681 1727204702.32868: dumping result to json 43681 1727204702.32870: done dumping result, returning 43681 1727204702.32881: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12b410aa-8751-9e86-7728-0000000003d0] 43681 1727204702.32886: sending task result for task 12b410aa-8751-9e86-7728-0000000003d0 43681 1727204702.32982: done sending task result for task 12b410aa-8751-9e86-7728-0000000003d0 43681 1727204702.32985: WORKER PROCESS EXITING skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 43681 1727204702.33036: no more pending results, returning what we have 43681 1727204702.33040: results queue empty 43681 1727204702.33041: checking for any_errors_fatal 43681 1727204702.33042: done checking for any_errors_fatal 43681 1727204702.33043: checking for max_fail_percentage 43681 1727204702.33044: done checking for max_fail_percentage 43681 1727204702.33046: checking to see if all hosts have failed and the running 
result is not ok 43681 1727204702.33047: done checking to see if all hosts have failed 43681 1727204702.33047: getting the remaining hosts for this loop 43681 1727204702.33049: done getting the remaining hosts for this loop 43681 1727204702.33053: getting the next task for host managed-node3 43681 1727204702.33064: done getting next task for host managed-node3 43681 1727204702.33069: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 43681 1727204702.33073: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 43681 1727204702.33088: getting variables 43681 1727204702.33092: in VariableManager get_vars() 43681 1727204702.33135: Calling all_inventory to load vars for managed-node3 43681 1727204702.33138: Calling groups_inventory to load vars for managed-node3 43681 1727204702.33141: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204702.33153: Calling all_plugins_play to load vars for managed-node3 43681 1727204702.33156: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204702.33159: Calling groups_plugins_play to load vars for managed-node3 43681 1727204702.33356: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204702.33548: done with get_vars() 43681 1727204702.33559: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 15:05:02 -0400 (0:00:00.046) 0:00:10.003 ***** 43681 1727204702.33648: entering _queue_task() for managed-node3/stat 43681 1727204702.33871: worker is 1 (out of 1 available) 43681 1727204702.33887: exiting _queue_task() for managed-node3/stat 43681 1727204702.33902: done queuing things up, now waiting for results queue to drain 43681 1727204702.33904: waiting for pending results... 
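The earlier skip of "Ensure ansible_facts used by role are present" comes from the guard visible in the conditional evaluation: facts are only re-gathered when something listed in __network_required_facts is missing from the facts already collected, and here the difference was empty (the result is also hidden by no_log). A sketch of what such a guarded task at set_facts.yml:3 might look like, assuming only the setup action and the when expression shown in the log; no module parameters are visible, so none are included:

    - name: Ensure ansible_facts used by role are present
      setup:   # parameters not visible in this run
      when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0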
43681 1727204702.34086: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 43681 1727204702.34206: in run() - task 12b410aa-8751-9e86-7728-0000000003d2 43681 1727204702.34219: variable 'ansible_search_path' from source: unknown 43681 1727204702.34224: variable 'ansible_search_path' from source: unknown 43681 1727204702.34260: calling self._execute() 43681 1727204702.34330: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204702.34336: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204702.34350: variable 'omit' from source: magic vars 43681 1727204702.34670: variable 'ansible_distribution_major_version' from source: facts 43681 1727204702.34681: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204702.34829: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 43681 1727204702.35353: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 43681 1727204702.35393: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 43681 1727204702.35423: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 43681 1727204702.35455: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 43681 1727204702.35538: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 43681 1727204702.35561: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 43681 1727204702.35587: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204702.35611: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 43681 1727204702.35692: variable '__network_is_ostree' from source: set_fact 43681 1727204702.35699: Evaluated conditional (not __network_is_ostree is defined): False 43681 1727204702.35702: when evaluation is False, skipping this task 43681 1727204702.35707: _execute() done 43681 1727204702.35709: dumping result to json 43681 1727204702.35714: done dumping result, returning 43681 1727204702.35725: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [12b410aa-8751-9e86-7728-0000000003d2] 43681 1727204702.35731: sending task result for task 12b410aa-8751-9e86-7728-0000000003d2 43681 1727204702.35827: done sending task result for task 12b410aa-8751-9e86-7728-0000000003d2 43681 1727204702.35830: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 43681 1727204702.35887: no more pending results, returning what we have 43681 1727204702.35893: results queue empty 43681 1727204702.35894: checking for any_errors_fatal 43681 1727204702.35901: done checking for any_errors_fatal 43681 1727204702.35901: checking for 
max_fail_percentage 43681 1727204702.35903: done checking for max_fail_percentage 43681 1727204702.35905: checking to see if all hosts have failed and the running result is not ok 43681 1727204702.35906: done checking to see if all hosts have failed 43681 1727204702.35907: getting the remaining hosts for this loop 43681 1727204702.35909: done getting the remaining hosts for this loop 43681 1727204702.35913: getting the next task for host managed-node3 43681 1727204702.35919: done getting next task for host managed-node3 43681 1727204702.35924: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 43681 1727204702.35928: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 43681 1727204702.35945: getting variables 43681 1727204702.35947: in VariableManager get_vars() 43681 1727204702.35983: Calling all_inventory to load vars for managed-node3 43681 1727204702.35986: Calling groups_inventory to load vars for managed-node3 43681 1727204702.35996: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204702.36006: Calling all_plugins_play to load vars for managed-node3 43681 1727204702.36009: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204702.36012: Calling groups_plugins_play to load vars for managed-node3 43681 1727204702.36413: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204702.36599: done with get_vars() 43681 1727204702.36607: done getting variables 43681 1727204702.36658: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 15:05:02 -0400 (0:00:00.030) 0:00:10.033 ***** 43681 1727204702.36686: entering _queue_task() for managed-node3/set_fact 43681 1727204702.36922: worker is 1 (out of 1 available) 43681 1727204702.36939: exiting _queue_task() for managed-node3/set_fact 43681 1727204702.36954: done queuing things up, now waiting for results queue to drain 43681 1727204702.36956: waiting for pending results... 
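The two ostree tasks (set_facts.yml:12 and set_facts.yml:17) follow a cache-the-answer pattern: a stat-based check guarded by "not __network_is_ostree is defined", then a set_fact that records the result so later includes skip both steps, which is what the false_condition above indicates. In the sketch below the task names, modules, and when condition come from the log; the stat path, register name, and the expression assigned to the fact are assumptions added only for illustration:

    - name: Check if system is ostree
      stat:
        path: /run/ostree-booted              # assumed target, not shown in the log
      register: __ostree_booted_stat          # assumed register name
      when: not __network_is_ostree is defined

    - name: Set flag to indicate system is ostree
      set_fact:
        __network_is_ostree: "{{ __ostree_booted_stat.stat.exists | default(false) }}"   # assumed expression
      when: not __network_is_ostree is defined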
43681 1727204702.37140: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 43681 1727204702.37272: in run() - task 12b410aa-8751-9e86-7728-0000000003d3 43681 1727204702.37286: variable 'ansible_search_path' from source: unknown 43681 1727204702.37290: variable 'ansible_search_path' from source: unknown 43681 1727204702.37326: calling self._execute() 43681 1727204702.37394: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204702.37401: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204702.37414: variable 'omit' from source: magic vars 43681 1727204702.37796: variable 'ansible_distribution_major_version' from source: facts 43681 1727204702.37800: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204702.37907: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 43681 1727204702.38145: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 43681 1727204702.38187: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 43681 1727204702.38219: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 43681 1727204702.38251: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 43681 1727204702.38330: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 43681 1727204702.38352: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 43681 1727204702.38373: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204702.38402: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 43681 1727204702.38478: variable '__network_is_ostree' from source: set_fact 43681 1727204702.38485: Evaluated conditional (not __network_is_ostree is defined): False 43681 1727204702.38488: when evaluation is False, skipping this task 43681 1727204702.38493: _execute() done 43681 1727204702.38496: dumping result to json 43681 1727204702.38504: done dumping result, returning 43681 1727204702.38514: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12b410aa-8751-9e86-7728-0000000003d3] 43681 1727204702.38517: sending task result for task 12b410aa-8751-9e86-7728-0000000003d3 43681 1727204702.38610: done sending task result for task 12b410aa-8751-9e86-7728-0000000003d3 43681 1727204702.38613: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 43681 1727204702.38667: no more pending results, returning what we have 43681 1727204702.38672: results queue empty 43681 1727204702.38673: checking for any_errors_fatal 43681 1727204702.38681: done checking for any_errors_fatal 43681 
1727204702.38682: checking for max_fail_percentage 43681 1727204702.38684: done checking for max_fail_percentage 43681 1727204702.38685: checking to see if all hosts have failed and the running result is not ok 43681 1727204702.38686: done checking to see if all hosts have failed 43681 1727204702.38687: getting the remaining hosts for this loop 43681 1727204702.38688: done getting the remaining hosts for this loop 43681 1727204702.38695: getting the next task for host managed-node3 43681 1727204702.38705: done getting next task for host managed-node3 43681 1727204702.38709: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 43681 1727204702.38713: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 43681 1727204702.38729: getting variables 43681 1727204702.38731: in VariableManager get_vars() 43681 1727204702.38770: Calling all_inventory to load vars for managed-node3 43681 1727204702.38773: Calling groups_inventory to load vars for managed-node3 43681 1727204702.38776: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204702.38786: Calling all_plugins_play to load vars for managed-node3 43681 1727204702.38795: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204702.38800: Calling groups_plugins_play to load vars for managed-node3 43681 1727204702.38977: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204702.39178: done with get_vars() 43681 1727204702.39188: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 15:05:02 -0400 (0:00:00.025) 0:00:10.059 ***** 43681 1727204702.39270: entering _queue_task() for managed-node3/service_facts 43681 1727204702.39272: Creating lock for service_facts 43681 1727204702.39507: worker is 1 (out of 1 available) 43681 1727204702.39522: exiting _queue_task() for managed-node3/service_facts 43681 1727204702.39536: done queuing things up, now waiting for results queue to drain 43681 1727204702.39538: waiting for pending results... 
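The task just queued (set_facts.yml:21) gathers the service inventory with the service_facts module, whose result lands in ansible_facts.services; the role can then consult that map to decide how to manage the network service. Only the task name and module are visible in the log, so the first task below is kept bare and the second is a purely hypothetical usage example, not part of the logged run:

    - name: Check which services are running
      service_facts:   # populates ansible_facts.services; no parameters appear in the log

    # hypothetical usage, not from the logged run:
    - name: Show NetworkManager state
      debug:
        msg: "{{ ansible_facts.services['NetworkManager.service'].state | default('not present') }}"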
43681 1727204702.39727: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running 43681 1727204702.39838: in run() - task 12b410aa-8751-9e86-7728-0000000003d5 43681 1727204702.39851: variable 'ansible_search_path' from source: unknown 43681 1727204702.39855: variable 'ansible_search_path' from source: unknown 43681 1727204702.39886: calling self._execute() 43681 1727204702.39966: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204702.39973: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204702.39985: variable 'omit' from source: magic vars 43681 1727204702.40378: variable 'ansible_distribution_major_version' from source: facts 43681 1727204702.40390: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204702.40398: variable 'omit' from source: magic vars 43681 1727204702.40462: variable 'omit' from source: magic vars 43681 1727204702.40492: variable 'omit' from source: magic vars 43681 1727204702.40532: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204702.40568: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204702.40586: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204702.40605: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204702.40615: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204702.40644: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204702.40649: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204702.40652: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204702.40736: Set connection var ansible_shell_type to sh 43681 1727204702.40743: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204702.40750: Set connection var ansible_timeout to 10 43681 1727204702.40760: Set connection var ansible_pipelining to False 43681 1727204702.40773: Set connection var ansible_connection to ssh 43681 1727204702.40776: Set connection var ansible_shell_executable to /bin/sh 43681 1727204702.40797: variable 'ansible_shell_executable' from source: unknown 43681 1727204702.40800: variable 'ansible_connection' from source: unknown 43681 1727204702.40804: variable 'ansible_module_compression' from source: unknown 43681 1727204702.40807: variable 'ansible_shell_type' from source: unknown 43681 1727204702.40812: variable 'ansible_shell_executable' from source: unknown 43681 1727204702.40818: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204702.40821: variable 'ansible_pipelining' from source: unknown 43681 1727204702.40824: variable 'ansible_timeout' from source: unknown 43681 1727204702.40829: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204702.40996: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 43681 1727204702.41006: variable 'omit' from source: magic vars 43681 
1727204702.41012: starting attempt loop 43681 1727204702.41015: running the handler 43681 1727204702.41030: _low_level_execute_command(): starting 43681 1727204702.41037: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 43681 1727204702.41596: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204702.41600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204702.41604: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 43681 1727204702.41607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204702.41660: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204702.41663: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204702.41718: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204702.43474: stdout chunk (state=3): >>>/root <<< 43681 1727204702.43575: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204702.43643: stderr chunk (state=3): >>><<< 43681 1727204702.43648: stdout chunk (state=3): >>><<< 43681 1727204702.43670: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204702.43682: _low_level_execute_command(): starting 43681 1727204702.43691: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204702.4366972-44137-93665935327062 `" && echo ansible-tmp-1727204702.4366972-44137-93665935327062="` 
echo /root/.ansible/tmp/ansible-tmp-1727204702.4366972-44137-93665935327062 `" ) && sleep 0' 43681 1727204702.44182: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204702.44185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204702.44188: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204702.44210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204702.44249: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204702.44253: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204702.44300: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204702.46305: stdout chunk (state=3): >>>ansible-tmp-1727204702.4366972-44137-93665935327062=/root/.ansible/tmp/ansible-tmp-1727204702.4366972-44137-93665935327062 <<< 43681 1727204702.46413: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204702.46480: stderr chunk (state=3): >>><<< 43681 1727204702.46483: stdout chunk (state=3): >>><<< 43681 1727204702.46502: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204702.4366972-44137-93665935327062=/root/.ansible/tmp/ansible-tmp-1727204702.4366972-44137-93665935327062 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204702.46550: variable 'ansible_module_compression' from source: unknown 43681 1727204702.46597: ANSIBALLZ: Using lock for service_facts 43681 1727204702.46600: ANSIBALLZ: Acquiring lock 43681 1727204702.46603: ANSIBALLZ: Lock acquired: 140156136500032 
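For orientation while reading the AnsiballZ transfer and the long JSON payload further down in this stream: the module being packaged here is service_facts, and its stdout is a single JSON document of the form {"ansible_facts": {"services": {"<unit>": {"name": ..., "state": ..., "status": ..., "source": ...}}}}. The sketch below is only an illustration of consuming that structure offline; the file name service_facts.json and the choice to filter on "running" are assumptions for the example, not something this run does.

    import json

    # Hypothetical: a saved copy of the module's stdout as it appears later in
    # this debug stream (the real output is embedded inline as one long line).
    with open("service_facts.json") as fh:
        result = json.load(fh)

    services = result["ansible_facts"]["services"]

    # Example query: every unit the module reported as currently running.
    running = sorted(name for name, svc in services.items()
                     if svc.get("state") == "running")
    print("\n".join(running))
    # On the managed node in this log that list includes NetworkManager.service,
    # sshd.service, chronyd.service and systemd-journald.service, among others.
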
43681 1727204702.46607: ANSIBALLZ: Creating module 43681 1727204702.58137: ANSIBALLZ: Writing module into payload 43681 1727204702.58225: ANSIBALLZ: Writing module 43681 1727204702.58246: ANSIBALLZ: Renaming module 43681 1727204702.58252: ANSIBALLZ: Done creating module 43681 1727204702.58269: variable 'ansible_facts' from source: unknown 43681 1727204702.58326: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204702.4366972-44137-93665935327062/AnsiballZ_service_facts.py 43681 1727204702.58442: Sending initial data 43681 1727204702.58445: Sent initial data (161 bytes) 43681 1727204702.58935: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204702.58939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204702.58942: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204702.58944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204702.59004: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204702.59012: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204702.59015: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204702.59059: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204702.60769: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 43681 1727204702.60806: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 43681 1727204702.60848: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-436817kzep24q/tmpq7ft1rxt /root/.ansible/tmp/ansible-tmp-1727204702.4366972-44137-93665935327062/AnsiballZ_service_facts.py <<< 43681 1727204702.60851: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204702.4366972-44137-93665935327062/AnsiballZ_service_facts.py" <<< 43681 1727204702.60875: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-436817kzep24q/tmpq7ft1rxt" to remote "/root/.ansible/tmp/ansible-tmp-1727204702.4366972-44137-93665935327062/AnsiballZ_service_facts.py" <<< 43681 1727204702.60886: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204702.4366972-44137-93665935327062/AnsiballZ_service_facts.py" <<< 43681 1727204702.61677: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204702.61754: stderr chunk (state=3): >>><<< 43681 1727204702.61758: stdout chunk (state=3): >>><<< 43681 1727204702.61778: done transferring module to remote 43681 1727204702.61792: _low_level_execute_command(): starting 43681 1727204702.61800: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204702.4366972-44137-93665935327062/ /root/.ansible/tmp/ansible-tmp-1727204702.4366972-44137-93665935327062/AnsiballZ_service_facts.py && sleep 0' 43681 1727204702.62288: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204702.62292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204702.62295: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204702.62303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204702.62357: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204702.62361: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204702.62406: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204702.64263: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204702.64318: stderr chunk (state=3): >>><<< 43681 1727204702.64322: stdout chunk (state=3): >>><<< 43681 1727204702.64342: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 
10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204702.64348: _low_level_execute_command(): starting 43681 1727204702.64355: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204702.4366972-44137-93665935327062/AnsiballZ_service_facts.py && sleep 0' 43681 1727204702.64837: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204702.64840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204702.64843: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204702.64845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204702.64895: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204702.64919: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204702.64952: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204704.66519: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", 
"status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": 
"stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": 
"rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "<<< 43681 1727204704.66538: stdout chunk (state=3): >>>source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit<<< 43681 1727204704.66545: stdout chunk (state=3): >>>.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "syst<<< 43681 1727204704.66571: stdout chunk (state=3): >>>emd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": 
"systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": 
{"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "st<<< 43681 1727204704.66583: stdout chunk (state=3): >>>atus": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", 
"state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 43681 1727204704.68266: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 43681 1727204704.68332: stderr chunk (state=3): >>><<< 43681 1727204704.68336: stdout chunk (state=3): >>><<< 43681 1727204704.68360: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": 
"dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", 
"source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": 
{"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": 
"static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": 
"active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": 
{"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
43681 1727204704.68983: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204702.4366972-44137-93665935327062/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 43681 1727204704.68995: _low_level_execute_command(): starting 43681 1727204704.69001: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204702.4366972-44137-93665935327062/ > /dev/null 2>&1 && sleep 0' 43681 1727204704.69500: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204704.69503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204704.69506: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204704.69514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204704.69570: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204704.69574: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204704.69578: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204704.69623: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204704.71618: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204704.71676: stderr chunk (state=3): >>><<< 43681 1727204704.71679: stdout chunk (state=3): >>><<< 43681 1727204704.71697: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204704.71705: handler run complete 43681 1727204704.71880: variable 'ansible_facts' from source: unknown 43681 1727204704.72020: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204704.72447: variable 'ansible_facts' from source: unknown 43681 1727204704.72566: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204704.72765: attempt loop complete, returning result 43681 1727204704.72772: _execute() done 43681 1727204704.72775: dumping result to json 43681 1727204704.72821: done dumping result, returning 43681 1727204704.72831: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running [12b410aa-8751-9e86-7728-0000000003d5] 43681 1727204704.72839: sending task result for task 12b410aa-8751-9e86-7728-0000000003d5 ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 43681 1727204704.73647: no more pending results, returning what we have 43681 1727204704.73650: results queue empty 43681 1727204704.73650: checking for any_errors_fatal 43681 1727204704.73653: done checking for any_errors_fatal 43681 1727204704.73653: checking for max_fail_percentage 43681 1727204704.73654: done checking for max_fail_percentage 43681 1727204704.73655: checking to see if all hosts have failed and the running result is not ok 43681 1727204704.73656: done checking to see if all hosts have failed 43681 1727204704.73656: getting the remaining hosts for this loop 43681 1727204704.73657: done getting the remaining hosts for this loop 43681 1727204704.73660: getting the next task for host managed-node3 43681 1727204704.73664: done getting next task for host managed-node3 43681 1727204704.73667: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 43681 1727204704.73670: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204704.73680: done sending task result for task 12b410aa-8751-9e86-7728-0000000003d5 43681 1727204704.73683: WORKER PROCESS EXITING 43681 1727204704.73688: getting variables 43681 1727204704.73691: in VariableManager get_vars() 43681 1727204704.73725: Calling all_inventory to load vars for managed-node3 43681 1727204704.73727: Calling groups_inventory to load vars for managed-node3 43681 1727204704.73729: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204704.73737: Calling all_plugins_play to load vars for managed-node3 43681 1727204704.73739: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204704.73741: Calling groups_plugins_play to load vars for managed-node3 43681 1727204704.74084: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204704.74554: done with get_vars() 43681 1727204704.74567: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 15:05:04 -0400 (0:00:02.353) 0:00:12.413 ***** 43681 1727204704.74652: entering _queue_task() for managed-node3/package_facts 43681 1727204704.74657: Creating lock for package_facts 43681 1727204704.74912: worker is 1 (out of 1 available) 43681 1727204704.74930: exiting _queue_task() for managed-node3/package_facts 43681 1727204704.74944: done queuing things up, now waiting for results queue to drain 43681 1727204704.74946: waiting for pending results... 43681 1727204704.75136: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 43681 1727204704.75250: in run() - task 12b410aa-8751-9e86-7728-0000000003d6 43681 1727204704.75263: variable 'ansible_search_path' from source: unknown 43681 1727204704.75267: variable 'ansible_search_path' from source: unknown 43681 1727204704.75305: calling self._execute() 43681 1727204704.75374: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204704.75382: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204704.75394: variable 'omit' from source: magic vars 43681 1727204704.75724: variable 'ansible_distribution_major_version' from source: facts 43681 1727204704.75734: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204704.75741: variable 'omit' from source: magic vars 43681 1727204704.75801: variable 'omit' from source: magic vars 43681 1727204704.75833: variable 'omit' from source: magic vars 43681 1727204704.75871: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204704.75907: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204704.75926: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204704.75946: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204704.75958: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204704.75986: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204704.75991: variable 'ansible_host' from source: host vars for 
'managed-node3' 43681 1727204704.75996: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204704.76082: Set connection var ansible_shell_type to sh 43681 1727204704.76091: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204704.76098: Set connection var ansible_timeout to 10 43681 1727204704.76106: Set connection var ansible_pipelining to False 43681 1727204704.76113: Set connection var ansible_connection to ssh 43681 1727204704.76121: Set connection var ansible_shell_executable to /bin/sh 43681 1727204704.76139: variable 'ansible_shell_executable' from source: unknown 43681 1727204704.76142: variable 'ansible_connection' from source: unknown 43681 1727204704.76145: variable 'ansible_module_compression' from source: unknown 43681 1727204704.76149: variable 'ansible_shell_type' from source: unknown 43681 1727204704.76152: variable 'ansible_shell_executable' from source: unknown 43681 1727204704.76159: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204704.76161: variable 'ansible_pipelining' from source: unknown 43681 1727204704.76167: variable 'ansible_timeout' from source: unknown 43681 1727204704.76170: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204704.76343: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 43681 1727204704.76354: variable 'omit' from source: magic vars 43681 1727204704.76360: starting attempt loop 43681 1727204704.76363: running the handler 43681 1727204704.76377: _low_level_execute_command(): starting 43681 1727204704.76385: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 43681 1727204704.76940: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204704.76944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204704.76947: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204704.76949: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204704.76998: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204704.77010: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204704.77053: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204704.78762: stdout chunk (state=3): >>>/root <<< 43681 1727204704.78868: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204704.78934: stderr 
chunk (state=3): >>><<< 43681 1727204704.78938: stdout chunk (state=3): >>><<< 43681 1727204704.78961: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204704.78973: _low_level_execute_command(): starting 43681 1727204704.78981: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204704.7896056-44167-193232438002847 `" && echo ansible-tmp-1727204704.7896056-44167-193232438002847="` echo /root/.ansible/tmp/ansible-tmp-1727204704.7896056-44167-193232438002847 `" ) && sleep 0' 43681 1727204704.79472: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204704.79476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204704.79478: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204704.79490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204704.79540: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204704.79550: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204704.79584: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204704.81575: stdout chunk (state=3): >>>ansible-tmp-1727204704.7896056-44167-193232438002847=/root/.ansible/tmp/ansible-tmp-1727204704.7896056-44167-193232438002847 <<< 43681 1727204704.81686: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204704.81748: stderr chunk (state=3): >>><<< 43681 
1727204704.81752: stdout chunk (state=3): >>><<< 43681 1727204704.81766: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204704.7896056-44167-193232438002847=/root/.ansible/tmp/ansible-tmp-1727204704.7896056-44167-193232438002847 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204704.81820: variable 'ansible_module_compression' from source: unknown 43681 1727204704.81863: ANSIBALLZ: Using lock for package_facts 43681 1727204704.81867: ANSIBALLZ: Acquiring lock 43681 1727204704.81869: ANSIBALLZ: Lock acquired: 140156134555520 43681 1727204704.81874: ANSIBALLZ: Creating module 43681 1727204705.06487: ANSIBALLZ: Writing module into payload 43681 1727204705.06614: ANSIBALLZ: Writing module 43681 1727204705.06644: ANSIBALLZ: Renaming module 43681 1727204705.06651: ANSIBALLZ: Done creating module 43681 1727204705.06685: variable 'ansible_facts' from source: unknown 43681 1727204705.06838: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204704.7896056-44167-193232438002847/AnsiballZ_package_facts.py 43681 1727204705.06974: Sending initial data 43681 1727204705.06977: Sent initial data (162 bytes) 43681 1727204705.07481: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204705.07484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204705.07487: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204705.07492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204705.07553: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204705.07558: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 
43681 1727204705.07561: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204705.07608: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204705.09328: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 43681 1727204705.09366: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 43681 1727204705.09402: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-436817kzep24q/tmpzbk50rs3 /root/.ansible/tmp/ansible-tmp-1727204704.7896056-44167-193232438002847/AnsiballZ_package_facts.py <<< 43681 1727204705.09406: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204704.7896056-44167-193232438002847/AnsiballZ_package_facts.py" <<< 43681 1727204705.09436: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-436817kzep24q/tmpzbk50rs3" to remote "/root/.ansible/tmp/ansible-tmp-1727204704.7896056-44167-193232438002847/AnsiballZ_package_facts.py" <<< 43681 1727204705.09443: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204704.7896056-44167-193232438002847/AnsiballZ_package_facts.py" <<< 43681 1727204705.11121: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204705.11202: stderr chunk (state=3): >>><<< 43681 1727204705.11206: stdout chunk (state=3): >>><<< 43681 1727204705.11229: done transferring module to remote 43681 1727204705.11244: _low_level_execute_command(): starting 43681 1727204705.11251: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204704.7896056-44167-193232438002847/ /root/.ansible/tmp/ansible-tmp-1727204704.7896056-44167-193232438002847/AnsiballZ_package_facts.py && sleep 0' 43681 1727204705.11755: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204705.11759: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 43681 1727204705.11762: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204705.11764: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204705.11766: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204705.11826: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204705.11830: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204705.11872: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204705.13743: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204705.13803: stderr chunk (state=3): >>><<< 43681 1727204705.13807: stdout chunk (state=3): >>><<< 43681 1727204705.13824: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204705.13827: _low_level_execute_command(): starting 43681 1727204705.13833: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204704.7896056-44167-193232438002847/AnsiballZ_package_facts.py && sleep 0' 43681 1727204705.14462: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204705.14467: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204705.14586: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204705.14594: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 
43681 1727204705.14608: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204705.78322: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "a<<< 43681 1727204705.78357: stdout chunk (state=3): >>>rch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": 
"3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": <<< 43681 1727204705.78431: stdout chunk (state=3): >>>"rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": 
"5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": 
[{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": 
"nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": 
"2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release"<<< 43681 1727204705.78462: stdout chunk (state=3): >>>: "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": 
[{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": 
"1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.<<< 43681 1727204705.78544: stdout chunk (state=3): >>>fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": 
"perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": 
"rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": 
"rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch<<< 43681 1727204705.78593: stdout chunk (state=3): >>>": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": 
"libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", 
"version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 43681 1727204705.80708: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 43681 1727204705.80712: stdout chunk (state=3): >>><<< 43681 1727204705.80715: stderr chunk (state=3): >>><<< 43681 1727204705.80736: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": 
"20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", 
"version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": 
[{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": 
[{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": 
"1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 
1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": 
"perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": 
"rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": 
"rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": 
[{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], 
"strategy": "first"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 43681 1727204705.85699: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204704.7896056-44167-193232438002847/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 43681 1727204705.85796: _low_level_execute_command(): starting 43681 1727204705.85801: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204704.7896056-44167-193232438002847/ > /dev/null 2>&1 && sleep 0' 43681 1727204705.86407: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204705.86430: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204705.86447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204705.86472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204705.86494: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 43681 1727204705.86578: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204705.86622: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 
1727204705.86643: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204705.86675: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204705.86751: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204705.88831: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204705.88854: stdout chunk (state=3): >>><<< 43681 1727204705.88873: stderr chunk (state=3): >>><<< 43681 1727204705.88996: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204705.89000: handler run complete 43681 1727204705.95447: variable 'ansible_facts' from source: unknown 43681 1727204705.96352: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204706.00713: variable 'ansible_facts' from source: unknown 43681 1727204706.01529: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204706.03149: attempt loop complete, returning result 43681 1727204706.03201: _execute() done 43681 1727204706.03204: dumping result to json 43681 1727204706.03624: done dumping result, returning 43681 1727204706.03632: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [12b410aa-8751-9e86-7728-0000000003d6] 43681 1727204706.03635: sending task result for task 12b410aa-8751-9e86-7728-0000000003d6 43681 1727204706.07678: done sending task result for task 12b410aa-8751-9e86-7728-0000000003d6 43681 1727204706.07682: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 43681 1727204706.07781: no more pending results, returning what we have 43681 1727204706.07784: results queue empty 43681 1727204706.07786: checking for any_errors_fatal 43681 1727204706.07793: done checking for any_errors_fatal 43681 1727204706.07794: checking for max_fail_percentage 43681 1727204706.07796: done checking for max_fail_percentage 43681 1727204706.07797: checking to see if all hosts have failed and the running result is not ok 43681 1727204706.07798: done checking to see if all hosts have failed 43681 1727204706.07799: getting the remaining hosts for this loop 43681 1727204706.07800: 
done getting the remaining hosts for this loop 43681 1727204706.07805: getting the next task for host managed-node3 43681 1727204706.07812: done getting next task for host managed-node3 43681 1727204706.07819: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 43681 1727204706.07822: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 43681 1727204706.07835: getting variables 43681 1727204706.07837: in VariableManager get_vars() 43681 1727204706.07879: Calling all_inventory to load vars for managed-node3 43681 1727204706.07883: Calling groups_inventory to load vars for managed-node3 43681 1727204706.07886: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204706.07900: Calling all_plugins_play to load vars for managed-node3 43681 1727204706.07904: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204706.07908: Calling groups_plugins_play to load vars for managed-node3 43681 1727204706.10054: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204706.13297: done with get_vars() 43681 1727204706.13346: done getting variables 43681 1727204706.13435: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 15:05:06 -0400 (0:00:01.388) 0:00:13.801 ***** 43681 1727204706.13487: entering _queue_task() for managed-node3/debug 43681 1727204706.13863: worker is 1 (out of 1 available) 43681 1727204706.13881: exiting _queue_task() for managed-node3/debug 43681 1727204706.14101: done queuing things up, now waiting for results queue to drain 43681 1727204706.14103: waiting for pending results... 
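The package inventory dumped above is the standard output shape of the ansible.builtin.package_facts module (a dict keyed by package name, each entry carrying name, version, release, epoch, arch and source), invoked here with manager "auto" and strategy "first" as recorded in the module_args. A minimal sketch of collecting and querying that data in a standalone play; the host pattern and the package looked up are illustrative, not taken from this run:

# Sketch of how the package inventory above can be collected and queried.
# The play target and the package looked up are illustrative only.
- hosts: managed-node1
  gather_facts: false
  tasks:
    - name: Check which packages are installed
      ansible.builtin.package_facts:
        manager: auto        # same module_args as in the log ("manager": ["auto"], "strategy": "first")
        strategy: first

    - name: Report the installed NetworkManager version
      ansible.builtin.debug:
        msg: "NetworkManager {{ ansible_facts.packages['NetworkManager'][0].version }}"
      when: "'NetworkManager' in ansible_facts.packages"

With strategy "first", only the first package manager that returns data is consulted, which is why every entry above carries "source": "rpm".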
43681 1727204706.14300: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider 43681 1727204706.14385: in run() - task 12b410aa-8751-9e86-7728-000000000018 43681 1727204706.14421: variable 'ansible_search_path' from source: unknown 43681 1727204706.14497: variable 'ansible_search_path' from source: unknown 43681 1727204706.14500: calling self._execute() 43681 1727204706.14593: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204706.14607: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204706.14624: variable 'omit' from source: magic vars 43681 1727204706.15132: variable 'ansible_distribution_major_version' from source: facts 43681 1727204706.15155: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204706.15169: variable 'omit' from source: magic vars 43681 1727204706.15257: variable 'omit' from source: magic vars 43681 1727204706.15406: variable 'network_provider' from source: set_fact 43681 1727204706.15439: variable 'omit' from source: magic vars 43681 1727204706.15507: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204706.15619: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204706.15627: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204706.15631: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204706.15648: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204706.15691: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204706.15702: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204706.15711: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204706.15866: Set connection var ansible_shell_type to sh 43681 1727204706.15880: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204706.15896: Set connection var ansible_timeout to 10 43681 1727204706.15943: Set connection var ansible_pipelining to False 43681 1727204706.15946: Set connection var ansible_connection to ssh 43681 1727204706.15955: Set connection var ansible_shell_executable to /bin/sh 43681 1727204706.15981: variable 'ansible_shell_executable' from source: unknown 43681 1727204706.15993: variable 'ansible_connection' from source: unknown 43681 1727204706.16002: variable 'ansible_module_compression' from source: unknown 43681 1727204706.16052: variable 'ansible_shell_type' from source: unknown 43681 1727204706.16056: variable 'ansible_shell_executable' from source: unknown 43681 1727204706.16063: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204706.16066: variable 'ansible_pipelining' from source: unknown 43681 1727204706.16068: variable 'ansible_timeout' from source: unknown 43681 1727204706.16070: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204706.16248: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 43681 1727204706.16273: variable 'omit' from source: magic vars 43681 1727204706.16293: starting attempt loop 43681 1727204706.16380: running the handler 43681 1727204706.16388: handler run complete 43681 1727204706.16397: attempt loop complete, returning result 43681 1727204706.16407: _execute() done 43681 1727204706.16415: dumping result to json 43681 1727204706.16426: done dumping result, returning 43681 1727204706.16440: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider [12b410aa-8751-9e86-7728-000000000018] 43681 1727204706.16452: sending task result for task 12b410aa-8751-9e86-7728-000000000018 ok: [managed-node3] => {} MSG: Using network provider: nm 43681 1727204706.16683: no more pending results, returning what we have 43681 1727204706.16687: results queue empty 43681 1727204706.16691: checking for any_errors_fatal 43681 1727204706.16704: done checking for any_errors_fatal 43681 1727204706.16705: checking for max_fail_percentage 43681 1727204706.16708: done checking for max_fail_percentage 43681 1727204706.16709: checking to see if all hosts have failed and the running result is not ok 43681 1727204706.16710: done checking to see if all hosts have failed 43681 1727204706.16711: getting the remaining hosts for this loop 43681 1727204706.16712: done getting the remaining hosts for this loop 43681 1727204706.16836: getting the next task for host managed-node3 43681 1727204706.16844: done getting next task for host managed-node3 43681 1727204706.16850: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 43681 1727204706.16854: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204706.16895: done sending task result for task 12b410aa-8751-9e86-7728-000000000018 43681 1727204706.16899: WORKER PROCESS EXITING 43681 1727204706.16909: getting variables 43681 1727204706.16911: in VariableManager get_vars() 43681 1727204706.17074: Calling all_inventory to load vars for managed-node3 43681 1727204706.17077: Calling groups_inventory to load vars for managed-node3 43681 1727204706.17081: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204706.17093: Calling all_plugins_play to load vars for managed-node3 43681 1727204706.17097: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204706.17101: Calling groups_plugins_play to load vars for managed-node3 43681 1727204706.19570: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204706.23106: done with get_vars() 43681 1727204706.23251: done getting variables 43681 1727204706.23481: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:05:06 -0400 (0:00:00.100) 0:00:13.901 ***** 43681 1727204706.23528: entering _queue_task() for managed-node3/fail 43681 1727204706.24275: worker is 1 (out of 1 available) 43681 1727204706.24291: exiting _queue_task() for managed-node3/fail 43681 1727204706.24308: done queuing things up, now waiting for results queue to drain 43681 1727204706.24395: waiting for pending results... 
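The ok result above ("Using network provider: nm") is produced by a plain debug task at roles/network/tasks/main.yml:7, reading the network_provider variable set earlier via set_fact. A rough sketch of what such a task could look like; the role's exact wording may differ:

# Sketch only; the real task sits at roles/network/tasks/main.yml:7 in the collection.
# The message text matches the result printed above.
- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"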
43681 1727204706.24806: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 43681 1727204706.25598: in run() - task 12b410aa-8751-9e86-7728-000000000019 43681 1727204706.25603: variable 'ansible_search_path' from source: unknown 43681 1727204706.25606: variable 'ansible_search_path' from source: unknown 43681 1727204706.25609: calling self._execute() 43681 1727204706.25897: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204706.25901: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204706.25904: variable 'omit' from source: magic vars 43681 1727204706.26677: variable 'ansible_distribution_major_version' from source: facts 43681 1727204706.26996: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204706.27081: variable 'network_state' from source: role '' defaults 43681 1727204706.27104: Evaluated conditional (network_state != {}): False 43681 1727204706.27113: when evaluation is False, skipping this task 43681 1727204706.27125: _execute() done 43681 1727204706.27202: dumping result to json 43681 1727204706.27213: done dumping result, returning 43681 1727204706.27232: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12b410aa-8751-9e86-7728-000000000019] 43681 1727204706.27246: sending task result for task 12b410aa-8751-9e86-7728-000000000019 skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 43681 1727204706.27527: no more pending results, returning what we have 43681 1727204706.27532: results queue empty 43681 1727204706.27534: checking for any_errors_fatal 43681 1727204706.27541: done checking for any_errors_fatal 43681 1727204706.27542: checking for max_fail_percentage 43681 1727204706.27544: done checking for max_fail_percentage 43681 1727204706.27546: checking to see if all hosts have failed and the running result is not ok 43681 1727204706.27547: done checking to see if all hosts have failed 43681 1727204706.27548: getting the remaining hosts for this loop 43681 1727204706.27550: done getting the remaining hosts for this loop 43681 1727204706.27555: getting the next task for host managed-node3 43681 1727204706.27564: done getting next task for host managed-node3 43681 1727204706.27569: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 43681 1727204706.27573: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204706.27596: getting variables 43681 1727204706.27599: in VariableManager get_vars() 43681 1727204706.27648: Calling all_inventory to load vars for managed-node3 43681 1727204706.27652: Calling groups_inventory to load vars for managed-node3 43681 1727204706.27655: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204706.27671: Calling all_plugins_play to load vars for managed-node3 43681 1727204706.27674: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204706.27679: Calling groups_plugins_play to load vars for managed-node3 43681 1727204706.28408: done sending task result for task 12b410aa-8751-9e86-7728-000000000019 43681 1727204706.28411: WORKER PROCESS EXITING 43681 1727204706.30409: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204706.33349: done with get_vars() 43681 1727204706.33398: done getting variables 43681 1727204706.33472: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:05:06 -0400 (0:00:00.099) 0:00:14.001 ***** 43681 1727204706.33515: entering _queue_task() for managed-node3/fail 43681 1727204706.33872: worker is 1 (out of 1 available) 43681 1727204706.33887: exiting _queue_task() for managed-node3/fail 43681 1727204706.34005: done queuing things up, now waiting for results queue to drain 43681 1727204706.34007: waiting for pending results... 
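The abort task above is skipped because its conditional network_state != {} evaluates to False (network_state falls back to the role's empty default). A hedged sketch of that guard pattern: only the condition the log actually evaluated is shown, the initscripts-provider check implied by the task name is omitted, and the failure message is invented for illustration:

# Sketch of the guard implied by the skip above; the message is illustrative and
# any additional conditions in the real task (main.yml:11) are omitted.
- name: Abort when network_state is used with the initscripts provider
  ansible.builtin.fail:
    msg: Applying the network state configuration is not supported with the initscripts provider
  when:
    - network_state != {}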
43681 1727204706.34216: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 43681 1727204706.34377: in run() - task 12b410aa-8751-9e86-7728-00000000001a 43681 1727204706.34401: variable 'ansible_search_path' from source: unknown 43681 1727204706.34410: variable 'ansible_search_path' from source: unknown 43681 1727204706.34456: calling self._execute() 43681 1727204706.34559: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204706.34579: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204706.34598: variable 'omit' from source: magic vars 43681 1727204706.35042: variable 'ansible_distribution_major_version' from source: facts 43681 1727204706.35060: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204706.35220: variable 'network_state' from source: role '' defaults 43681 1727204706.35338: Evaluated conditional (network_state != {}): False 43681 1727204706.35342: when evaluation is False, skipping this task 43681 1727204706.35345: _execute() done 43681 1727204706.35348: dumping result to json 43681 1727204706.35350: done dumping result, returning 43681 1727204706.35353: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12b410aa-8751-9e86-7728-00000000001a] 43681 1727204706.35356: sending task result for task 12b410aa-8751-9e86-7728-00000000001a 43681 1727204706.35433: done sending task result for task 12b410aa-8751-9e86-7728-00000000001a 43681 1727204706.35436: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 43681 1727204706.35492: no more pending results, returning what we have 43681 1727204706.35498: results queue empty 43681 1727204706.35499: checking for any_errors_fatal 43681 1727204706.35508: done checking for any_errors_fatal 43681 1727204706.35509: checking for max_fail_percentage 43681 1727204706.35511: done checking for max_fail_percentage 43681 1727204706.35512: checking to see if all hosts have failed and the running result is not ok 43681 1727204706.35513: done checking to see if all hosts have failed 43681 1727204706.35514: getting the remaining hosts for this loop 43681 1727204706.35515: done getting the remaining hosts for this loop 43681 1727204706.35520: getting the next task for host managed-node3 43681 1727204706.35527: done getting next task for host managed-node3 43681 1727204706.35531: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 43681 1727204706.35534: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204706.35554: getting variables 43681 1727204706.35556: in VariableManager get_vars() 43681 1727204706.35599: Calling all_inventory to load vars for managed-node3 43681 1727204706.35602: Calling groups_inventory to load vars for managed-node3 43681 1727204706.35605: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204706.35618: Calling all_plugins_play to load vars for managed-node3 43681 1727204706.35623: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204706.35627: Calling groups_plugins_play to load vars for managed-node3 43681 1727204706.38030: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204706.41068: done with get_vars() 43681 1727204706.41112: done getting variables 43681 1727204706.41184: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:05:06 -0400 (0:00:00.077) 0:00:14.078 ***** 43681 1727204706.41227: entering _queue_task() for managed-node3/fail 43681 1727204706.41570: worker is 1 (out of 1 available) 43681 1727204706.41585: exiting _queue_task() for managed-node3/fail 43681 1727204706.41800: done queuing things up, now waiting for results queue to drain 43681 1727204706.41803: waiting for pending results... 
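Each of these tasks first re-evaluates ansible_distribution_major_version != '6' before testing its own condition. A small, illustrative helper task (not part of the role) for surfacing the facts those gates depend on:

# Illustrative helper, not part of the role: show the facts the conditionals read.
- name: Show the facts the role's version gates rely on
  ansible.builtin.debug:
    msg: "distribution={{ ansible_distribution }} major_version={{ ansible_distribution_major_version }}"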
43681 1727204706.41934: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 43681 1727204706.42139: in run() - task 12b410aa-8751-9e86-7728-00000000001b 43681 1727204706.42144: variable 'ansible_search_path' from source: unknown 43681 1727204706.42149: variable 'ansible_search_path' from source: unknown 43681 1727204706.42154: calling self._execute() 43681 1727204706.42270: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204706.42285: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204706.42307: variable 'omit' from source: magic vars 43681 1727204706.42775: variable 'ansible_distribution_major_version' from source: facts 43681 1727204706.42801: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204706.43041: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 43681 1727204706.46084: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 43681 1727204706.46397: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 43681 1727204706.46401: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 43681 1727204706.46404: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 43681 1727204706.46407: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 43681 1727204706.46417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204706.46461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204706.46499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204706.46559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204706.46586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204706.46717: variable 'ansible_distribution_major_version' from source: facts 43681 1727204706.46749: Evaluated conditional (ansible_distribution_major_version | int > 9): True 43681 1727204706.46912: variable 'ansible_distribution' from source: facts 43681 1727204706.46924: variable '__network_rh_distros' from source: role '' defaults 43681 1727204706.46940: Evaluated conditional (ansible_distribution in __network_rh_distros): False 43681 1727204706.46951: when evaluation is False, skipping this task 43681 1727204706.46962: _execute() done 43681 1727204706.47066: dumping result to json 43681 1727204706.47070: done dumping result, returning 43681 1727204706.47074: done running TaskExecutor() 
for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12b410aa-8751-9e86-7728-00000000001b] 43681 1727204706.47077: sending task result for task 12b410aa-8751-9e86-7728-00000000001b 43681 1727204706.47157: done sending task result for task 12b410aa-8751-9e86-7728-00000000001b 43681 1727204706.47161: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 43681 1727204706.47226: no more pending results, returning what we have 43681 1727204706.47230: results queue empty 43681 1727204706.47231: checking for any_errors_fatal 43681 1727204706.47238: done checking for any_errors_fatal 43681 1727204706.47239: checking for max_fail_percentage 43681 1727204706.47241: done checking for max_fail_percentage 43681 1727204706.47242: checking to see if all hosts have failed and the running result is not ok 43681 1727204706.47244: done checking to see if all hosts have failed 43681 1727204706.47245: getting the remaining hosts for this loop 43681 1727204706.47246: done getting the remaining hosts for this loop 43681 1727204706.47251: getting the next task for host managed-node3 43681 1727204706.47259: done getting next task for host managed-node3 43681 1727204706.47263: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 43681 1727204706.47267: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204706.47285: getting variables 43681 1727204706.47288: in VariableManager get_vars() 43681 1727204706.47335: Calling all_inventory to load vars for managed-node3 43681 1727204706.47338: Calling groups_inventory to load vars for managed-node3 43681 1727204706.47342: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204706.47353: Calling all_plugins_play to load vars for managed-node3 43681 1727204706.47357: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204706.47361: Calling groups_plugins_play to load vars for managed-node3 43681 1727204706.49913: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204706.52844: done with get_vars() 43681 1727204706.52894: done getting variables 43681 1727204706.53016: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 15:05:06 -0400 (0:00:00.118) 0:00:14.197 ***** 43681 1727204706.53055: entering _queue_task() for managed-node3/dnf 43681 1727204706.53414: worker is 1 (out of 1 available) 43681 1727204706.53431: exiting _queue_task() for managed-node3/dnf 43681 1727204706.53445: done queuing things up, now waiting for results queue to drain 43681 1727204706.53447: waiting for pending results... 
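The teaming guard above is skipped on this Fedora host: ansible_distribution_major_version | int > 9 evaluates True, but ansible_distribution in __network_rh_distros evaluates False. A hedged sketch of that guard using the same two conditions; the failure message is invented and any further checks in the real role (main.yml:25) are omitted:

# Sketch of the EL10+ teaming guard; both conditions appear in the evaluations above.
- name: Abort applying teaming configuration on EL10 or later
  ansible.builtin.fail:
    msg: Teaming is not supported on this distribution version
  when:
    - ansible_distribution_major_version | int > 9
    - ansible_distribution in __network_rh_distros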
43681 1727204706.53821: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 43681 1727204706.53929: in run() - task 12b410aa-8751-9e86-7728-00000000001c 43681 1727204706.53954: variable 'ansible_search_path' from source: unknown 43681 1727204706.53965: variable 'ansible_search_path' from source: unknown 43681 1727204706.54013: calling self._execute() 43681 1727204706.54133: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204706.54137: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204706.54154: variable 'omit' from source: magic vars 43681 1727204706.54678: variable 'ansible_distribution_major_version' from source: facts 43681 1727204706.54682: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204706.54884: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 43681 1727204706.57541: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 43681 1727204706.57643: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 43681 1727204706.57694: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 43681 1727204706.57745: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 43681 1727204706.57780: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 43681 1727204706.57883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204706.57925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204706.57968: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204706.58025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204706.58058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204706.58277: variable 'ansible_distribution' from source: facts 43681 1727204706.58280: variable 'ansible_distribution_major_version' from source: facts 43681 1727204706.58283: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 43681 1727204706.58375: variable '__network_wireless_connections_defined' from source: role '' defaults 43681 1727204706.58566: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204706.58607: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204706.58644: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204706.58699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204706.58728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204706.58787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204706.58829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204706.58866: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204706.58923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204706.58951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204706.59042: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204706.59046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204706.59086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204706.59142: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204706.59172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204706.59396: variable 'network_connections' from source: task vars 43681 1727204706.59410: variable 'interface' from source: set_fact 43681 1727204706.59590: variable 'interface' from source: set_fact 43681 1727204706.59597: variable 'interface' from source: set_fact 43681 1727204706.59605: variable 'interface' from source: set_fact 43681 1727204706.59745: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' 
skipped due to reserved name 43681 1727204706.59901: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 43681 1727204706.59946: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 43681 1727204706.59972: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 43681 1727204706.60000: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 43681 1727204706.60041: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 43681 1727204706.60061: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 43681 1727204706.60086: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204706.60110: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 43681 1727204706.60166: variable '__network_team_connections_defined' from source: role '' defaults 43681 1727204706.60384: variable 'network_connections' from source: task vars 43681 1727204706.60391: variable 'interface' from source: set_fact 43681 1727204706.60444: variable 'interface' from source: set_fact 43681 1727204706.60450: variable 'interface' from source: set_fact 43681 1727204706.60505: variable 'interface' from source: set_fact 43681 1727204706.60552: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 43681 1727204706.60556: when evaluation is False, skipping this task 43681 1727204706.60558: _execute() done 43681 1727204706.60561: dumping result to json 43681 1727204706.60572: done dumping result, returning 43681 1727204706.60576: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12b410aa-8751-9e86-7728-00000000001c] 43681 1727204706.60582: sending task result for task 12b410aa-8751-9e86-7728-00000000001c 43681 1727204706.60684: done sending task result for task 12b410aa-8751-9e86-7728-00000000001c 43681 1727204706.60688: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 43681 1727204706.60751: no more pending results, returning what we have 43681 1727204706.60756: results queue empty 43681 1727204706.60757: checking for any_errors_fatal 43681 1727204706.60763: done checking for any_errors_fatal 43681 1727204706.60764: checking for max_fail_percentage 43681 1727204706.60766: done checking for max_fail_percentage 43681 1727204706.60767: checking to see if all hosts have failed and the running result is not ok 43681 1727204706.60768: done checking to see if all hosts have failed 43681 1727204706.60769: getting the remaining hosts for this loop 43681 1727204706.60770: done getting the remaining hosts for this loop 43681 
1727204706.60776: getting the next task for host managed-node3 43681 1727204706.60783: done getting next task for host managed-node3 43681 1727204706.60788: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 43681 1727204706.60793: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 43681 1727204706.60812: getting variables 43681 1727204706.60814: in VariableManager get_vars() 43681 1727204706.60857: Calling all_inventory to load vars for managed-node3 43681 1727204706.60860: Calling groups_inventory to load vars for managed-node3 43681 1727204706.60862: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204706.60872: Calling all_plugins_play to load vars for managed-node3 43681 1727204706.60875: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204706.60878: Calling groups_plugins_play to load vars for managed-node3 43681 1727204706.62821: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204706.68337: done with get_vars() 43681 1727204706.68371: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 43681 1727204706.68459: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:05:06 -0400 (0:00:00.154) 0:00:14.351 ***** 43681 1727204706.68503: entering _queue_task() for managed-node3/yum 43681 1727204706.68513: Creating lock for yum 43681 1727204706.68936: worker is 1 (out of 1 available) 43681 1727204706.68951: exiting _queue_task() for managed-node3/yum 43681 1727204706.68965: done queuing things up, now waiting for results queue to drain 43681 1727204706.68966: waiting for pending results... 
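Before being skipped, the DNF check traced above passed the (ansible_distribution_major_version != '6') guard that precedes each task evaluation in this excerpt, passed (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7), and then failed on (__network_wireless_connections_defined or __network_team_connections_defined), evidently because the profile driven by network_connections here is neither wireless nor a team interface. A hedged sketch of what such a check-only task can look like; the dnf arguments and check_mode are assumptions, only the task name and the conditions come from the log:

    # Illustrative sketch only; module arguments are assumed, conditions are the
    # ones evaluated in the trace above.
    - name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
      ansible.builtin.dnf:
        name: "{{ network_packages }}"
        state: latest
      check_mode: true
      when:
        - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
        - __network_wireless_connections_defined or __network_team_connections_defined

Running the module in check mode would only report whether updates are available rather than applying them, which matches the task's wording.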
43681 1727204706.69324: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 43681 1727204706.69596: in run() - task 12b410aa-8751-9e86-7728-00000000001d 43681 1727204706.69601: variable 'ansible_search_path' from source: unknown 43681 1727204706.69605: variable 'ansible_search_path' from source: unknown 43681 1727204706.69608: calling self._execute() 43681 1727204706.69684: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204706.69703: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204706.69727: variable 'omit' from source: magic vars 43681 1727204706.70228: variable 'ansible_distribution_major_version' from source: facts 43681 1727204706.70247: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204706.70494: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 43681 1727204706.72848: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 43681 1727204706.72912: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 43681 1727204706.72948: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 43681 1727204706.72980: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 43681 1727204706.73010: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 43681 1727204706.73086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204706.73113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204706.73142: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204706.73176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204706.73197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204706.73288: variable 'ansible_distribution_major_version' from source: facts 43681 1727204706.73305: Evaluated conditional (ansible_distribution_major_version | int < 8): False 43681 1727204706.73309: when evaluation is False, skipping this task 43681 1727204706.73312: _execute() done 43681 1727204706.73315: dumping result to json 43681 1727204706.73321: done dumping result, returning 43681 1727204706.73330: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12b410aa-8751-9e86-7728-00000000001d] 43681 
1727204706.73336: sending task result for task 12b410aa-8751-9e86-7728-00000000001d 43681 1727204706.73445: done sending task result for task 12b410aa-8751-9e86-7728-00000000001d 43681 1727204706.73448: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 43681 1727204706.73523: no more pending results, returning what we have 43681 1727204706.73528: results queue empty 43681 1727204706.73529: checking for any_errors_fatal 43681 1727204706.73537: done checking for any_errors_fatal 43681 1727204706.73538: checking for max_fail_percentage 43681 1727204706.73540: done checking for max_fail_percentage 43681 1727204706.73541: checking to see if all hosts have failed and the running result is not ok 43681 1727204706.73542: done checking to see if all hosts have failed 43681 1727204706.73543: getting the remaining hosts for this loop 43681 1727204706.73544: done getting the remaining hosts for this loop 43681 1727204706.73549: getting the next task for host managed-node3 43681 1727204706.73556: done getting next task for host managed-node3 43681 1727204706.73560: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 43681 1727204706.73564: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204706.73581: getting variables 43681 1727204706.73583: in VariableManager get_vars() 43681 1727204706.73627: Calling all_inventory to load vars for managed-node3 43681 1727204706.73630: Calling groups_inventory to load vars for managed-node3 43681 1727204706.73632: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204706.73642: Calling all_plugins_play to load vars for managed-node3 43681 1727204706.73645: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204706.73649: Calling groups_plugins_play to load vars for managed-node3 43681 1727204706.75539: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204706.77172: done with get_vars() 43681 1727204706.77202: done getting variables 43681 1727204706.77258: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:05:06 -0400 (0:00:00.087) 0:00:14.439 ***** 43681 1727204706.77287: entering _queue_task() for managed-node3/fail 43681 1727204706.77558: worker is 1 (out of 1 available) 43681 1727204706.77576: exiting _queue_task() for managed-node3/fail 43681 1727204706.77593: done queuing things up, now waiting for results queue to drain 43681 1727204706.77595: waiting for pending results... 
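The YUM variant at main.yml:48 mirrors the DNF check but is fenced off by (ansible_distribution_major_version | int < 8), which evaluated to False on this host, so it was skipped before any wireless/team test would be reached. The "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" line above also shows that on this platform the yum action is simply handled by the dnf plugin. A sketch of that complementary guard, with the same caveat that the module arguments are assumptions:

    # Illustrative sketch only; shows how the YUM variant's version guard
    # complements the DNF variant. Module arguments are assumed.
    - name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
      ansible.builtin.yum:
        name: "{{ network_packages }}"
        state: latest
      check_mode: true
      when:
        - ansible_distribution_major_version | int < 8
        - __network_wireless_connections_defined or __network_team_connections_defined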
43681 1727204706.77798: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 43681 1727204706.77906: in run() - task 12b410aa-8751-9e86-7728-00000000001e 43681 1727204706.77918: variable 'ansible_search_path' from source: unknown 43681 1727204706.77924: variable 'ansible_search_path' from source: unknown 43681 1727204706.77964: calling self._execute() 43681 1727204706.78048: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204706.78052: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204706.78064: variable 'omit' from source: magic vars 43681 1727204706.78408: variable 'ansible_distribution_major_version' from source: facts 43681 1727204706.78422: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204706.78530: variable '__network_wireless_connections_defined' from source: role '' defaults 43681 1727204706.78706: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 43681 1727204706.80817: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 43681 1727204706.80869: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 43681 1727204706.80915: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 43681 1727204706.80948: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 43681 1727204706.80970: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 43681 1727204706.81046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204706.81069: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204706.81093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204706.81132: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204706.81144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204706.81184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204706.81212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204706.81235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204706.81266: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204706.81279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204706.81320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204706.81342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204706.81362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204706.81393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204706.81406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204706.81553: variable 'network_connections' from source: task vars 43681 1727204706.81565: variable 'interface' from source: set_fact 43681 1727204706.81631: variable 'interface' from source: set_fact 43681 1727204706.81641: variable 'interface' from source: set_fact 43681 1727204706.81694: variable 'interface' from source: set_fact 43681 1727204706.81780: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 43681 1727204706.81919: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 43681 1727204706.81954: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 43681 1727204706.81993: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 43681 1727204706.82020: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 43681 1727204706.82057: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 43681 1727204706.82075: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 43681 1727204706.82102: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204706.82128: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 43681 1727204706.82179: 
variable '__network_team_connections_defined' from source: role '' defaults 43681 1727204706.82385: variable 'network_connections' from source: task vars 43681 1727204706.82392: variable 'interface' from source: set_fact 43681 1727204706.82449: variable 'interface' from source: set_fact 43681 1727204706.82455: variable 'interface' from source: set_fact 43681 1727204706.82507: variable 'interface' from source: set_fact 43681 1727204706.82559: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 43681 1727204706.82563: when evaluation is False, skipping this task 43681 1727204706.82565: _execute() done 43681 1727204706.82568: dumping result to json 43681 1727204706.82571: done dumping result, returning 43681 1727204706.82580: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12b410aa-8751-9e86-7728-00000000001e] 43681 1727204706.82592: sending task result for task 12b410aa-8751-9e86-7728-00000000001e 43681 1727204706.82686: done sending task result for task 12b410aa-8751-9e86-7728-00000000001e 43681 1727204706.82698: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 43681 1727204706.82753: no more pending results, returning what we have 43681 1727204706.82758: results queue empty 43681 1727204706.82759: checking for any_errors_fatal 43681 1727204706.82768: done checking for any_errors_fatal 43681 1727204706.82768: checking for max_fail_percentage 43681 1727204706.82770: done checking for max_fail_percentage 43681 1727204706.82771: checking to see if all hosts have failed and the running result is not ok 43681 1727204706.82772: done checking to see if all hosts have failed 43681 1727204706.82773: getting the remaining hosts for this loop 43681 1727204706.82775: done getting the remaining hosts for this loop 43681 1727204706.82779: getting the next task for host managed-node3 43681 1727204706.82786: done getting next task for host managed-node3 43681 1727204706.82793: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 43681 1727204706.82796: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204706.82814: getting variables 43681 1727204706.82816: in VariableManager get_vars() 43681 1727204706.82858: Calling all_inventory to load vars for managed-node3 43681 1727204706.82861: Calling groups_inventory to load vars for managed-node3 43681 1727204706.82864: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204706.82875: Calling all_plugins_play to load vars for managed-node3 43681 1727204706.82877: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204706.82883: Calling groups_plugins_play to load vars for managed-node3 43681 1727204706.84301: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204706.85910: done with get_vars() 43681 1727204706.85941: done getting variables 43681 1727204706.85997: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:05:06 -0400 (0:00:00.087) 0:00:14.526 ***** 43681 1727204706.86024: entering _queue_task() for managed-node3/package 43681 1727204706.86299: worker is 1 (out of 1 available) 43681 1727204706.86315: exiting _queue_task() for managed-node3/package 43681 1727204706.86329: done queuing things up, now waiting for results queue to drain 43681 1727204706.86331: waiting for pending results... 43681 1727204706.86532: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages 43681 1727204706.86661: in run() - task 12b410aa-8751-9e86-7728-00000000001f 43681 1727204706.86677: variable 'ansible_search_path' from source: unknown 43681 1727204706.86681: variable 'ansible_search_path' from source: unknown 43681 1727204706.86720: calling self._execute() 43681 1727204706.86800: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204706.86807: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204706.86819: variable 'omit' from source: magic vars 43681 1727204706.87152: variable 'ansible_distribution_major_version' from source: facts 43681 1727204706.87163: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204706.87339: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 43681 1727204706.87567: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 43681 1727204706.87608: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 43681 1727204706.87641: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 43681 1727204706.87705: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 43681 1727204706.87805: variable 'network_packages' from source: role '' defaults 43681 1727204706.87898: variable '__network_provider_setup' from source: role '' defaults 43681 1727204706.87909: variable '__network_service_name_default_nm' from source: role '' defaults 43681 1727204706.87969: variable 
'__network_service_name_default_nm' from source: role '' defaults 43681 1727204706.87981: variable '__network_packages_default_nm' from source: role '' defaults 43681 1727204706.88036: variable '__network_packages_default_nm' from source: role '' defaults 43681 1727204706.88195: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 43681 1727204706.89790: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 43681 1727204706.89844: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 43681 1727204706.89876: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 43681 1727204706.89907: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 43681 1727204706.89932: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 43681 1727204706.90013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204706.90039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204706.90060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204706.90103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204706.90116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204706.90157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204706.90183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204706.90205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204706.90239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204706.90251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204706.90452: variable '__network_packages_default_gobject_packages' from source: role '' defaults 43681 1727204706.90555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204706.90575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204706.90597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204706.90636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204706.90648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204706.90730: variable 'ansible_python' from source: facts 43681 1727204706.90753: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 43681 1727204706.90826: variable '__network_wpa_supplicant_required' from source: role '' defaults 43681 1727204706.90895: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 43681 1727204706.91005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204706.91028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204706.91055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204706.91084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204706.91099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204706.91141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204706.91169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204706.91191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204706.91224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204706.91237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204706.91357: variable 'network_connections' from source: task vars 43681 1727204706.91363: variable 'interface' from source: set_fact 43681 1727204706.91447: variable 'interface' from source: set_fact 43681 1727204706.91456: variable 'interface' from source: set_fact 43681 1727204706.91540: variable 'interface' from source: set_fact 43681 1727204706.91625: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 43681 1727204706.91649: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 43681 1727204706.91673: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204706.91701: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 43681 1727204706.91746: variable '__network_wireless_connections_defined' from source: role '' defaults 43681 1727204706.91980: variable 'network_connections' from source: task vars 43681 1727204706.91983: variable 'interface' from source: set_fact 43681 1727204706.92068: variable 'interface' from source: set_fact 43681 1727204706.92077: variable 'interface' from source: set_fact 43681 1727204706.92158: variable 'interface' from source: set_fact 43681 1727204706.92224: variable '__network_packages_default_wireless' from source: role '' defaults 43681 1727204706.92294: variable '__network_wireless_connections_defined' from source: role '' defaults 43681 1727204706.92546: variable 'network_connections' from source: task vars 43681 1727204706.92549: variable 'interface' from source: set_fact 43681 1727204706.92608: variable 'interface' from source: set_fact 43681 1727204706.92614: variable 'interface' from source: set_fact 43681 1727204706.92670: variable 'interface' from source: set_fact 43681 1727204706.92714: variable '__network_packages_default_team' from source: role '' defaults 43681 1727204706.92779: variable '__network_team_connections_defined' from source: role '' defaults 43681 1727204706.93032: variable 'network_connections' from source: task vars 43681 1727204706.93036: variable 'interface' from source: set_fact 43681 1727204706.93091: variable 'interface' from source: set_fact 43681 1727204706.93097: variable 'interface' from source: set_fact 43681 1727204706.93158: variable 'interface' from source: set_fact 43681 1727204706.93231: variable '__network_service_name_default_initscripts' from source: role '' defaults 43681 1727204706.93283: variable '__network_service_name_default_initscripts' from source: role '' defaults 43681 1727204706.93292: variable '__network_packages_default_initscripts' from source: role '' defaults 43681 1727204706.93345: variable '__network_packages_default_initscripts' from source: role '' defaults 43681 1727204706.93530: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 43681 1727204706.93940: variable 'network_connections' from source: task vars 43681 1727204706.93944: variable 'interface' from source: set_fact 43681 
1727204706.93999: variable 'interface' from source: set_fact 43681 1727204706.94009: variable 'interface' from source: set_fact 43681 1727204706.94062: variable 'interface' from source: set_fact 43681 1727204706.94088: variable 'ansible_distribution' from source: facts 43681 1727204706.94093: variable '__network_rh_distros' from source: role '' defaults 43681 1727204706.94101: variable 'ansible_distribution_major_version' from source: facts 43681 1727204706.94127: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 43681 1727204706.94265: variable 'ansible_distribution' from source: facts 43681 1727204706.94269: variable '__network_rh_distros' from source: role '' defaults 43681 1727204706.94275: variable 'ansible_distribution_major_version' from source: facts 43681 1727204706.94282: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 43681 1727204706.94446: variable 'ansible_distribution' from source: facts 43681 1727204706.94452: variable '__network_rh_distros' from source: role '' defaults 43681 1727204706.94454: variable 'ansible_distribution_major_version' from source: facts 43681 1727204706.94464: variable 'network_provider' from source: set_fact 43681 1727204706.94479: variable 'ansible_facts' from source: unknown 43681 1727204706.95277: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 43681 1727204706.95281: when evaluation is False, skipping this task 43681 1727204706.95283: _execute() done 43681 1727204706.95286: dumping result to json 43681 1727204706.95288: done dumping result, returning 43681 1727204706.95299: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages [12b410aa-8751-9e86-7728-00000000001f] 43681 1727204706.95305: sending task result for task 12b410aa-8751-9e86-7728-00000000001f 43681 1727204706.95409: done sending task result for task 12b410aa-8751-9e86-7728-00000000001f 43681 1727204706.95412: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 43681 1727204706.95476: no more pending results, returning what we have 43681 1727204706.95480: results queue empty 43681 1727204706.95482: checking for any_errors_fatal 43681 1727204706.95492: done checking for any_errors_fatal 43681 1727204706.95493: checking for max_fail_percentage 43681 1727204706.95495: done checking for max_fail_percentage 43681 1727204706.95496: checking to see if all hosts have failed and the running result is not ok 43681 1727204706.95497: done checking to see if all hosts have failed 43681 1727204706.95497: getting the remaining hosts for this loop 43681 1727204706.95499: done getting the remaining hosts for this loop 43681 1727204706.95504: getting the next task for host managed-node3 43681 1727204706.95511: done getting next task for host managed-node3 43681 1727204706.95519: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 43681 1727204706.95522: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 43681 1727204706.95546: getting variables 43681 1727204706.95548: in VariableManager get_vars() 43681 1727204706.95587: Calling all_inventory to load vars for managed-node3 43681 1727204706.95597: Calling groups_inventory to load vars for managed-node3 43681 1727204706.95600: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204706.95612: Calling all_plugins_play to load vars for managed-node3 43681 1727204706.95615: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204706.95621: Calling groups_plugins_play to load vars for managed-node3 43681 1727204706.96909: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204706.98640: done with get_vars() 43681 1727204706.98665: done getting variables 43681 1727204706.98722: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 15:05:06 -0400 (0:00:00.127) 0:00:14.654 ***** 43681 1727204706.98754: entering _queue_task() for managed-node3/package 43681 1727204706.99033: worker is 1 (out of 1 available) 43681 1727204706.99048: exiting _queue_task() for managed-node3/package 43681 1727204706.99062: done queuing things up, now waiting for results queue to drain 43681 1727204706.99064: waiting for pending results... 
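The "Install packages" task at main.yml:73 was skipped because (not network_packages is subset(ansible_facts.packages.keys())) evaluated to False, that is, every package listed in network_packages is already present in the gathered package facts, so no install transaction is needed. A sketch of that install pattern; the package module arguments are assumptions, the subset() test is the condition from the log, and it presumes ansible_facts.packages was populated earlier (typically by ansible.builtin.package_facts):

    # Illustrative sketch only; skips the install when everything in
    # network_packages already appears in the package facts.
    - name: Install packages
      ansible.builtin.package:
        name: "{{ network_packages }}"
        state: present
      when:
        - not network_packages is subset(ansible_facts.packages.keys())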
43681 1727204706.99259: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 43681 1727204706.99360: in run() - task 12b410aa-8751-9e86-7728-000000000020 43681 1727204706.99373: variable 'ansible_search_path' from source: unknown 43681 1727204706.99376: variable 'ansible_search_path' from source: unknown 43681 1727204706.99415: calling self._execute() 43681 1727204706.99522: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204706.99526: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204706.99531: variable 'omit' from source: magic vars 43681 1727204706.99852: variable 'ansible_distribution_major_version' from source: facts 43681 1727204706.99865: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204706.99971: variable 'network_state' from source: role '' defaults 43681 1727204706.99985: Evaluated conditional (network_state != {}): False 43681 1727204706.99988: when evaluation is False, skipping this task 43681 1727204706.99994: _execute() done 43681 1727204707.00064: dumping result to json 43681 1727204707.00067: done dumping result, returning 43681 1727204707.00069: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12b410aa-8751-9e86-7728-000000000020] 43681 1727204707.00072: sending task result for task 12b410aa-8751-9e86-7728-000000000020 43681 1727204707.00148: done sending task result for task 12b410aa-8751-9e86-7728-000000000020 43681 1727204707.00151: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 43681 1727204707.00221: no more pending results, returning what we have 43681 1727204707.00226: results queue empty 43681 1727204707.00227: checking for any_errors_fatal 43681 1727204707.00233: done checking for any_errors_fatal 43681 1727204707.00234: checking for max_fail_percentage 43681 1727204707.00235: done checking for max_fail_percentage 43681 1727204707.00236: checking to see if all hosts have failed and the running result is not ok 43681 1727204707.00237: done checking to see if all hosts have failed 43681 1727204707.00238: getting the remaining hosts for this loop 43681 1727204707.00240: done getting the remaining hosts for this loop 43681 1727204707.00244: getting the next task for host managed-node3 43681 1727204707.00251: done getting next task for host managed-node3 43681 1727204707.00255: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 43681 1727204707.00259: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204707.00276: getting variables 43681 1727204707.00278: in VariableManager get_vars() 43681 1727204707.00315: Calling all_inventory to load vars for managed-node3 43681 1727204707.00318: Calling groups_inventory to load vars for managed-node3 43681 1727204707.00321: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204707.00331: Calling all_plugins_play to load vars for managed-node3 43681 1727204707.00334: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204707.00337: Calling groups_plugins_play to load vars for managed-node3 43681 1727204707.01573: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204707.03166: done with get_vars() 43681 1727204707.03201: done getting variables 43681 1727204707.03255: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 15:05:07 -0400 (0:00:00.045) 0:00:14.699 ***** 43681 1727204707.03284: entering _queue_task() for managed-node3/package 43681 1727204707.03554: worker is 1 (out of 1 available) 43681 1727204707.03570: exiting _queue_task() for managed-node3/package 43681 1727204707.03585: done queuing things up, now waiting for results queue to drain 43681 1727204707.03588: waiting for pending results... 
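The nmstate-related installs are gated on (network_state != {}): the task at main.yml:85 above was skipped because network_state still holds its role default (an empty mapping, judging by the failed comparison), and the python3-libnmstate task just queued is, as the next trace shows, guarded the same way. A minimal sketch of that guard; the module arguments are assumptions, the package name is taken from the task title:

    # Illustrative sketch only; runs only when the caller drives the role
    # through network_state instead of network_connections.
    - name: Install python3-libnmstate when using network_state variable
      ansible.builtin.package:
        name: python3-libnmstate
        state: present
      when:
        - network_state != {}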
43681 1727204707.03797: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 43681 1727204707.03912: in run() - task 12b410aa-8751-9e86-7728-000000000021 43681 1727204707.03938: variable 'ansible_search_path' from source: unknown 43681 1727204707.03942: variable 'ansible_search_path' from source: unknown 43681 1727204707.03968: calling self._execute() 43681 1727204707.04050: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204707.04058: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204707.04068: variable 'omit' from source: magic vars 43681 1727204707.04402: variable 'ansible_distribution_major_version' from source: facts 43681 1727204707.04414: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204707.04523: variable 'network_state' from source: role '' defaults 43681 1727204707.04534: Evaluated conditional (network_state != {}): False 43681 1727204707.04538: when evaluation is False, skipping this task 43681 1727204707.04541: _execute() done 43681 1727204707.04545: dumping result to json 43681 1727204707.04549: done dumping result, returning 43681 1727204707.04558: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12b410aa-8751-9e86-7728-000000000021] 43681 1727204707.04564: sending task result for task 12b410aa-8751-9e86-7728-000000000021 43681 1727204707.04668: done sending task result for task 12b410aa-8751-9e86-7728-000000000021 43681 1727204707.04671: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 43681 1727204707.04737: no more pending results, returning what we have 43681 1727204707.04742: results queue empty 43681 1727204707.04743: checking for any_errors_fatal 43681 1727204707.04752: done checking for any_errors_fatal 43681 1727204707.04752: checking for max_fail_percentage 43681 1727204707.04754: done checking for max_fail_percentage 43681 1727204707.04755: checking to see if all hosts have failed and the running result is not ok 43681 1727204707.04756: done checking to see if all hosts have failed 43681 1727204707.04758: getting the remaining hosts for this loop 43681 1727204707.04759: done getting the remaining hosts for this loop 43681 1727204707.04763: getting the next task for host managed-node3 43681 1727204707.04769: done getting next task for host managed-node3 43681 1727204707.04773: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 43681 1727204707.04776: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204707.04795: getting variables 43681 1727204707.04797: in VariableManager get_vars() 43681 1727204707.04835: Calling all_inventory to load vars for managed-node3 43681 1727204707.04838: Calling groups_inventory to load vars for managed-node3 43681 1727204707.04840: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204707.04850: Calling all_plugins_play to load vars for managed-node3 43681 1727204707.04853: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204707.04857: Calling groups_plugins_play to load vars for managed-node3 43681 1727204707.06213: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204707.07795: done with get_vars() 43681 1727204707.07822: done getting variables 43681 1727204707.07912: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 15:05:07 -0400 (0:00:00.046) 0:00:14.746 ***** 43681 1727204707.07943: entering _queue_task() for managed-node3/service 43681 1727204707.07945: Creating lock for service 43681 1727204707.08214: worker is 1 (out of 1 available) 43681 1727204707.08230: exiting _queue_task() for managed-node3/service 43681 1727204707.08245: done queuing things up, now waiting for results queue to drain 43681 1727204707.08248: waiting for pending results... 
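[annotation] The service task queued here only fires when the role manages wireless or team connections; its skip result a little further down records the exact conditional. A plausible reconstruction of the task follows, with the module arguments hedged as assumptions (the NetworkManager unit name and state: restarted are inferred from the task name, not from the role source):

- name: Restart NetworkManager due to wireless or team interfaces
  service:
    name: NetworkManager   # inferred from the task name
    state: restarted       # assumed from "Restart" in the task name
  when: __network_wireless_connections_defined or __network_team_connections_defined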
43681 1727204707.08503: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 43681 1727204707.08605: in run() - task 12b410aa-8751-9e86-7728-000000000022 43681 1727204707.08625: variable 'ansible_search_path' from source: unknown 43681 1727204707.08628: variable 'ansible_search_path' from source: unknown 43681 1727204707.08660: calling self._execute() 43681 1727204707.08746: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204707.08755: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204707.08765: variable 'omit' from source: magic vars 43681 1727204707.09100: variable 'ansible_distribution_major_version' from source: facts 43681 1727204707.09111: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204707.09217: variable '__network_wireless_connections_defined' from source: role '' defaults 43681 1727204707.09396: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 43681 1727204707.11691: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 43681 1727204707.11885: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 43681 1727204707.11889: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 43681 1727204707.11893: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 43681 1727204707.11912: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 43681 1727204707.12000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204707.12019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204707.12041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204707.12074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204707.12086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204707.12137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204707.12156: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204707.12176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 43681 1727204707.12215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204707.12231: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204707.12266: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204707.12285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204707.12309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204707.12345: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204707.12358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204707.12507: variable 'network_connections' from source: task vars 43681 1727204707.12522: variable 'interface' from source: set_fact 43681 1727204707.12586: variable 'interface' from source: set_fact 43681 1727204707.12596: variable 'interface' from source: set_fact 43681 1727204707.12649: variable 'interface' from source: set_fact 43681 1727204707.12733: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 43681 1727204707.12883: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 43681 1727204707.12919: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 43681 1727204707.12945: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 43681 1727204707.12974: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 43681 1727204707.13013: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 43681 1727204707.13034: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 43681 1727204707.13055: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204707.13077: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 43681 1727204707.13135: variable '__network_team_connections_defined' from source: role '' defaults 
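[annotation] The repeated resolution of network_connections and interface here and just below happens while the two role-default booleans are rendered. Their real definitions live in the role defaults and are not shown in this log; purely as an illustration of the pattern, a check of this kind can be written as a Jinja2 filter chain over network_connections, for example:

# Hypothetical illustration only; the role's actual defaults are not visible in this log.
__network_wireless_connections_defined: "{{ network_connections | selectattr('type', 'defined') | selectattr('type', 'equalto', 'wireless') | list | length > 0 }}"
__network_team_connections_defined: "{{ network_connections | selectattr('type', 'defined') | selectattr('type', 'equalto', 'team') | list | length > 0 }}"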
43681 1727204707.13340: variable 'network_connections' from source: task vars 43681 1727204707.13345: variable 'interface' from source: set_fact 43681 1727204707.13397: variable 'interface' from source: set_fact 43681 1727204707.13406: variable 'interface' from source: set_fact 43681 1727204707.13458: variable 'interface' from source: set_fact 43681 1727204707.13506: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 43681 1727204707.13511: when evaluation is False, skipping this task 43681 1727204707.13514: _execute() done 43681 1727204707.13521: dumping result to json 43681 1727204707.13523: done dumping result, returning 43681 1727204707.13535: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12b410aa-8751-9e86-7728-000000000022] 43681 1727204707.13546: sending task result for task 12b410aa-8751-9e86-7728-000000000022 skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 43681 1727204707.13693: no more pending results, returning what we have 43681 1727204707.13698: results queue empty 43681 1727204707.13699: checking for any_errors_fatal 43681 1727204707.13707: done checking for any_errors_fatal 43681 1727204707.13708: checking for max_fail_percentage 43681 1727204707.13710: done checking for max_fail_percentage 43681 1727204707.13711: checking to see if all hosts have failed and the running result is not ok 43681 1727204707.13712: done checking to see if all hosts have failed 43681 1727204707.13713: getting the remaining hosts for this loop 43681 1727204707.13714: done getting the remaining hosts for this loop 43681 1727204707.13721: getting the next task for host managed-node3 43681 1727204707.13728: done getting next task for host managed-node3 43681 1727204707.13733: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 43681 1727204707.13736: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204707.13761: getting variables 43681 1727204707.13763: in VariableManager get_vars() 43681 1727204707.13807: Calling all_inventory to load vars for managed-node3 43681 1727204707.13810: Calling groups_inventory to load vars for managed-node3 43681 1727204707.13813: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204707.13825: Calling all_plugins_play to load vars for managed-node3 43681 1727204707.13828: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204707.13831: Calling groups_plugins_play to load vars for managed-node3 43681 1727204707.14421: done sending task result for task 12b410aa-8751-9e86-7728-000000000022 43681 1727204707.14425: WORKER PROCESS EXITING 43681 1727204707.16237: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204707.19396: done with get_vars() 43681 1727204707.19434: done getting variables 43681 1727204707.19509: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 15:05:07 -0400 (0:00:00.116) 0:00:14.862 ***** 43681 1727204707.19546: entering _queue_task() for managed-node3/service 43681 1727204707.19905: worker is 1 (out of 1 available) 43681 1727204707.19920: exiting _queue_task() for managed-node3/service 43681 1727204707.19933: done queuing things up, now waiting for results queue to drain 43681 1727204707.19935: waiting for pending results... 
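[annotation] Unlike the two tasks above, this one actually runs: the conditional (network_provider == "nm" or network_state != {}) evaluates True just below, and the module invocation echoed back later in the log shows name=NetworkManager, state=started, enabled=true. The corresponding role task plausibly looks like this sketch, with the service name coming from the network_service_name role variable that is resolved below (a reconstruction, not the role source):

- name: Enable and start NetworkManager
  service:
    name: "{{ network_service_name }}"   # resolves to NetworkManager in this run
    state: started
    enabled: true
  when: network_provider == "nm" or network_state != {}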
43681 1727204707.20320: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 43681 1727204707.20417: in run() - task 12b410aa-8751-9e86-7728-000000000023 43681 1727204707.20441: variable 'ansible_search_path' from source: unknown 43681 1727204707.20450: variable 'ansible_search_path' from source: unknown 43681 1727204707.20496: calling self._execute() 43681 1727204707.20608: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204707.20623: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204707.20643: variable 'omit' from source: magic vars 43681 1727204707.21099: variable 'ansible_distribution_major_version' from source: facts 43681 1727204707.21175: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204707.21335: variable 'network_provider' from source: set_fact 43681 1727204707.21345: variable 'network_state' from source: role '' defaults 43681 1727204707.21359: Evaluated conditional (network_provider == "nm" or network_state != {}): True 43681 1727204707.21370: variable 'omit' from source: magic vars 43681 1727204707.21443: variable 'omit' from source: magic vars 43681 1727204707.21478: variable 'network_service_name' from source: role '' defaults 43681 1727204707.21616: variable 'network_service_name' from source: role '' defaults 43681 1727204707.21722: variable '__network_provider_setup' from source: role '' defaults 43681 1727204707.21738: variable '__network_service_name_default_nm' from source: role '' defaults 43681 1727204707.21822: variable '__network_service_name_default_nm' from source: role '' defaults 43681 1727204707.21995: variable '__network_packages_default_nm' from source: role '' defaults 43681 1727204707.21999: variable '__network_packages_default_nm' from source: role '' defaults 43681 1727204707.22249: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 43681 1727204707.24863: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 43681 1727204707.24970: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 43681 1727204707.25025: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 43681 1727204707.25077: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 43681 1727204707.25119: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 43681 1727204707.25227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204707.25274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204707.25316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204707.25378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 43681 1727204707.25405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204707.25468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204707.25511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204707.25550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204707.25696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204707.25700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204707.25958: variable '__network_packages_default_gobject_packages' from source: role '' defaults 43681 1727204707.26124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204707.26163: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204707.26198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204707.26254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204707.26275: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204707.26394: variable 'ansible_python' from source: facts 43681 1727204707.26426: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 43681 1727204707.26573: variable '__network_wpa_supplicant_required' from source: role '' defaults 43681 1727204707.26645: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 43681 1727204707.26821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204707.26856: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204707.26887: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204707.26943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204707.26961: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204707.27194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204707.27206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204707.27209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204707.27212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204707.27214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204707.27359: variable 'network_connections' from source: task vars 43681 1727204707.27373: variable 'interface' from source: set_fact 43681 1727204707.27470: variable 'interface' from source: set_fact 43681 1727204707.27488: variable 'interface' from source: set_fact 43681 1727204707.27583: variable 'interface' from source: set_fact 43681 1727204707.27860: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 43681 1727204707.28135: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 43681 1727204707.28204: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 43681 1727204707.28258: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 43681 1727204707.28317: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 43681 1727204707.28469: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 43681 1727204707.28744: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 43681 1727204707.28748: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204707.28751: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 
(found_in_cache=True, class_only=False) 43681 1727204707.28821: variable '__network_wireless_connections_defined' from source: role '' defaults 43681 1727204707.29308: variable 'network_connections' from source: task vars 43681 1727204707.29322: variable 'interface' from source: set_fact 43681 1727204707.29430: variable 'interface' from source: set_fact 43681 1727204707.29448: variable 'interface' from source: set_fact 43681 1727204707.29554: variable 'interface' from source: set_fact 43681 1727204707.29767: variable '__network_packages_default_wireless' from source: role '' defaults 43681 1727204707.29897: variable '__network_wireless_connections_defined' from source: role '' defaults 43681 1727204707.30352: variable 'network_connections' from source: task vars 43681 1727204707.30364: variable 'interface' from source: set_fact 43681 1727204707.30459: variable 'interface' from source: set_fact 43681 1727204707.30474: variable 'interface' from source: set_fact 43681 1727204707.30566: variable 'interface' from source: set_fact 43681 1727204707.30645: variable '__network_packages_default_team' from source: role '' defaults 43681 1727204707.30759: variable '__network_team_connections_defined' from source: role '' defaults 43681 1727204707.31250: variable 'network_connections' from source: task vars 43681 1727204707.31254: variable 'interface' from source: set_fact 43681 1727204707.31306: variable 'interface' from source: set_fact 43681 1727204707.31349: variable 'interface' from source: set_fact 43681 1727204707.31683: variable 'interface' from source: set_fact 43681 1727204707.31749: variable '__network_service_name_default_initscripts' from source: role '' defaults 43681 1727204707.31835: variable '__network_service_name_default_initscripts' from source: role '' defaults 43681 1727204707.31849: variable '__network_packages_default_initscripts' from source: role '' defaults 43681 1727204707.31934: variable '__network_packages_default_initscripts' from source: role '' defaults 43681 1727204707.32595: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 43681 1727204707.34032: variable 'network_connections' from source: task vars 43681 1727204707.34152: variable 'interface' from source: set_fact 43681 1727204707.34370: variable 'interface' from source: set_fact 43681 1727204707.34374: variable 'interface' from source: set_fact 43681 1727204707.34423: variable 'interface' from source: set_fact 43681 1727204707.34474: variable 'ansible_distribution' from source: facts 43681 1727204707.34491: variable '__network_rh_distros' from source: role '' defaults 43681 1727204707.34504: variable 'ansible_distribution_major_version' from source: facts 43681 1727204707.34541: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 43681 1727204707.34780: variable 'ansible_distribution' from source: facts 43681 1727204707.34796: variable '__network_rh_distros' from source: role '' defaults 43681 1727204707.34814: variable 'ansible_distribution_major_version' from source: facts 43681 1727204707.34830: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 43681 1727204707.35069: variable 'ansible_distribution' from source: facts 43681 1727204707.35080: variable '__network_rh_distros' from source: role '' defaults 43681 1727204707.35095: variable 'ansible_distribution_major_version' from source: facts 43681 1727204707.35149: variable 'network_provider' from source: set_fact 43681 1727204707.35183: 
variable 'omit' from source: magic vars 43681 1727204707.35225: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204707.35272: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204707.35352: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204707.35355: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204707.35358: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204707.35387: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204707.35399: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204707.35410: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204707.35551: Set connection var ansible_shell_type to sh 43681 1727204707.35573: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204707.35675: Set connection var ansible_timeout to 10 43681 1727204707.35683: Set connection var ansible_pipelining to False 43681 1727204707.35686: Set connection var ansible_connection to ssh 43681 1727204707.35691: Set connection var ansible_shell_executable to /bin/sh 43681 1727204707.35693: variable 'ansible_shell_executable' from source: unknown 43681 1727204707.35696: variable 'ansible_connection' from source: unknown 43681 1727204707.35698: variable 'ansible_module_compression' from source: unknown 43681 1727204707.35700: variable 'ansible_shell_type' from source: unknown 43681 1727204707.35703: variable 'ansible_shell_executable' from source: unknown 43681 1727204707.35705: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204707.35717: variable 'ansible_pipelining' from source: unknown 43681 1727204707.35728: variable 'ansible_timeout' from source: unknown 43681 1727204707.35739: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204707.35883: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 43681 1727204707.35910: variable 'omit' from source: magic vars 43681 1727204707.35922: starting attempt loop 43681 1727204707.35929: running the handler 43681 1727204707.36047: variable 'ansible_facts' from source: unknown 43681 1727204707.37397: _low_level_execute_command(): starting 43681 1727204707.37416: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 43681 1727204707.38326: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204707.38332: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204707.38334: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204707.38353: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204707.38403: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204707.40197: stdout chunk (state=3): >>>/root <<< 43681 1727204707.40430: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204707.40435: stdout chunk (state=3): >>><<< 43681 1727204707.40437: stderr chunk (state=3): >>><<< 43681 1727204707.40464: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204707.40596: _low_level_execute_command(): starting 43681 1727204707.40602: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204707.4047287-44220-135527296278165 `" && echo ansible-tmp-1727204707.4047287-44220-135527296278165="` echo /root/.ansible/tmp/ansible-tmp-1727204707.4047287-44220-135527296278165 `" ) && sleep 0' 43681 1727204707.41173: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204707.41269: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 
10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204707.41301: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204707.41388: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204707.43425: stdout chunk (state=3): >>>ansible-tmp-1727204707.4047287-44220-135527296278165=/root/.ansible/tmp/ansible-tmp-1727204707.4047287-44220-135527296278165 <<< 43681 1727204707.43649: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204707.43653: stdout chunk (state=3): >>><<< 43681 1727204707.43656: stderr chunk (state=3): >>><<< 43681 1727204707.43796: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204707.4047287-44220-135527296278165=/root/.ansible/tmp/ansible-tmp-1727204707.4047287-44220-135527296278165 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204707.43799: variable 'ansible_module_compression' from source: unknown 43681 1727204707.43803: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 43681 1727204707.43805: ANSIBALLZ: Acquiring lock 43681 1727204707.43807: ANSIBALLZ: Lock acquired: 140156138759584 43681 1727204707.43809: ANSIBALLZ: Creating module 43681 1727204707.86397: ANSIBALLZ: Writing module into payload 43681 1727204707.86551: ANSIBALLZ: Writing module 43681 1727204707.86577: ANSIBALLZ: Renaming module 43681 1727204707.86584: ANSIBALLZ: Done creating module 43681 1727204707.86627: variable 'ansible_facts' from source: unknown 43681 1727204707.86773: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204707.4047287-44220-135527296278165/AnsiballZ_systemd.py 43681 1727204707.86904: Sending initial data 43681 1727204707.86911: Sent initial data (156 bytes) 43681 1727204707.87379: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204707.87383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204707.87386: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204707.87388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204707.87444: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204707.87448: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204707.87500: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204707.89239: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 43681 1727204707.89266: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 43681 1727204707.89294: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 43681 1727204707.89368: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-436817kzep24q/tmprq2y1eir /root/.ansible/tmp/ansible-tmp-1727204707.4047287-44220-135527296278165/AnsiballZ_systemd.py <<< 43681 1727204707.89372: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204707.4047287-44220-135527296278165/AnsiballZ_systemd.py" <<< 43681 1727204707.89413: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-436817kzep24q/tmprq2y1eir" to remote "/root/.ansible/tmp/ansible-tmp-1727204707.4047287-44220-135527296278165/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204707.4047287-44220-135527296278165/AnsiballZ_systemd.py" <<< 43681 1727204707.91215: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204707.91304: stderr chunk (state=3): >>><<< 43681 1727204707.91307: stdout chunk (state=3): >>><<< 43681 1727204707.91344: done transferring module to remote 43681 1727204707.91359: _low_level_execute_command(): starting 43681 1727204707.91370: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204707.4047287-44220-135527296278165/ /root/.ansible/tmp/ansible-tmp-1727204707.4047287-44220-135527296278165/AnsiballZ_systemd.py && sleep 0' 43681 1727204707.92124: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204707.92145: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204707.92158: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204707.92176: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204707.92250: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204707.94127: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204707.94172: stderr chunk (state=3): >>><<< 43681 1727204707.94175: stdout chunk (state=3): >>><<< 43681 1727204707.94190: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204707.94199: _low_level_execute_command(): starting 43681 1727204707.94202: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204707.4047287-44220-135527296278165/AnsiballZ_systemd.py && sleep 0' 43681 1727204707.94662: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204707.94666: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204707.94669: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 43681 1727204707.94672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204707.94720: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204707.94726: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204707.94775: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204708.27812: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "647", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:45:24 EDT", 
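[annotation] Note how the generic service action resolved to ansible.legacy.systemd earlier (the ANSIBALLZ lock and the AnsiballZ_systemd.py payload transferred above), so what actually executes on managed-node3 is the systemd module. Based on the module_args echoed back in the result that follows, an equivalent explicit invocation would be roughly this sketch:

- name: Enable and start NetworkManager (explicit systemd call, illustrative)
  ansible.builtin.systemd:
    name: NetworkManager
    state: started
    enabled: true
    daemon_reload: false
    daemon_reexec: false
    scope: system
    no_block: false

The large status map in the stdout chunk below is the set of systemd unit properties the module reports back (ActiveState, UnitFileState, and so on); since NetworkManager.service is already enabled and active, the task returns changed: false.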
"ExecMainStartTimestampMonotonic": "28911103", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "647", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3378", "MemoryCurrent": "11902976", "MemoryAvailable": "infinity", "CPUUsageNSec": "2000471000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "in<<< 43681 1727204708.27830: stdout chunk (state=3): >>>finity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", 
"LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target cloud-init.service shutdown.target NetworkManager-wait-online.service network.service network.target", "After": "network-pre.target basic.target dbus-broker.service cloud-init-local.service systemd-journald.socket sysinit.target dbus.socket system.slice", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", 
"AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "l<<< 43681 1727204708.27840: stdout chunk (state=3): >>>oaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:02:42 EDT", "StateChangeTimestampMonotonic": "1066831351", "InactiveExitTimestamp": "Tue 2024-09-24 14:45:24 EDT", "InactiveExitTimestampMonotonic": "28911342", "ActiveEnterTimestamp": "Tue 2024-09-24 14:45:25 EDT", "ActiveEnterTimestampMonotonic": "29816317", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:45:24 EDT", "ConditionTimestampMonotonic": "28901880", "AssertTimestamp": "Tue 2024-09-24 14:45:24 EDT", "AssertTimestampMonotonic": "28901883", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "b6a383b318af414f897f5e2227729b18", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 43681 1727204708.29926: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
<<< 43681 1727204708.29992: stderr chunk (state=3): >>><<< 43681 1727204708.29996: stdout chunk (state=3): >>><<< 43681 1727204708.30015: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "647", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:45:24 EDT", "ExecMainStartTimestampMonotonic": "28911103", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "647", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3378", "MemoryCurrent": "11902976", "MemoryAvailable": "infinity", "CPUUsageNSec": "2000471000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", 
"StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target cloud-init.service shutdown.target NetworkManager-wait-online.service network.service network.target", "After": "network-pre.target basic.target dbus-broker.service cloud-init-local.service systemd-journald.socket sysinit.target dbus.socket system.slice", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:02:42 EDT", "StateChangeTimestampMonotonic": "1066831351", "InactiveExitTimestamp": "Tue 2024-09-24 14:45:24 EDT", "InactiveExitTimestampMonotonic": "28911342", "ActiveEnterTimestamp": "Tue 2024-09-24 14:45:25 EDT", "ActiveEnterTimestampMonotonic": "29816317", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:45:24 EDT", "ConditionTimestampMonotonic": "28901880", "AssertTimestamp": "Tue 2024-09-24 14:45:24 EDT", "AssertTimestampMonotonic": "28901883", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "b6a383b318af414f897f5e2227729b18", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 43681 1727204708.30321: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204707.4047287-44220-135527296278165/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 43681 1727204708.30324: _low_level_execute_command(): starting 43681 1727204708.30327: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204707.4047287-44220-135527296278165/ > /dev/null 2>&1 && sleep 0' 43681 1727204708.30907: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204708.30928: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204708.30931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204708.30946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204708.30959: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 43681 1727204708.30967: stderr chunk (state=3): >>>debug2: match not found <<< 43681 1727204708.30977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204708.30995: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 43681 1727204708.31004: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address <<< 43681 1727204708.31013: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 43681 1727204708.31036: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204708.31040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204708.31099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204708.31103: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 43681 1727204708.31105: stderr chunk 
(state=3): >>>debug2: match found <<< 43681 1727204708.31107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204708.31149: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204708.31170: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204708.31180: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204708.31295: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204708.33236: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204708.33289: stderr chunk (state=3): >>><<< 43681 1727204708.33295: stdout chunk (state=3): >>><<< 43681 1727204708.33311: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204708.33321: handler run complete 43681 1727204708.33435: attempt loop complete, returning result 43681 1727204708.33438: _execute() done 43681 1727204708.33441: dumping result to json 43681 1727204708.33444: done dumping result, returning 43681 1727204708.33446: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12b410aa-8751-9e86-7728-000000000023] 43681 1727204708.33451: sending task result for task 12b410aa-8751-9e86-7728-000000000023 43681 1727204708.34985: done sending task result for task 12b410aa-8751-9e86-7728-000000000023 43681 1727204708.35092: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 43681 1727204708.35145: no more pending results, returning what we have 43681 1727204708.35149: results queue empty 43681 1727204708.35150: checking for any_errors_fatal 43681 1727204708.35157: done checking for any_errors_fatal 43681 1727204708.35158: checking for max_fail_percentage 43681 1727204708.35160: done checking for max_fail_percentage 43681 1727204708.35162: checking to see if all hosts have failed and the running result is not ok 43681 1727204708.35163: done checking to see if all hosts have failed 43681 1727204708.35164: getting the remaining hosts for this loop 43681 1727204708.35165: done getting the remaining hosts for this loop 43681 1727204708.35169: getting the next task for host managed-node3 43681 
1727204708.35175: done getting next task for host managed-node3 43681 1727204708.35180: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 43681 1727204708.35183: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 43681 1727204708.35196: getting variables 43681 1727204708.35198: in VariableManager get_vars() 43681 1727204708.35235: Calling all_inventory to load vars for managed-node3 43681 1727204708.35238: Calling groups_inventory to load vars for managed-node3 43681 1727204708.35241: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204708.35252: Calling all_plugins_play to load vars for managed-node3 43681 1727204708.35255: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204708.35259: Calling groups_plugins_play to load vars for managed-node3 43681 1727204708.37570: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204708.43640: done with get_vars() 43681 1727204708.43688: done getting variables 43681 1727204708.43840: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 15:05:08 -0400 (0:00:01.245) 0:00:16.107 ***** 43681 1727204708.44116: entering _queue_task() for managed-node3/service 43681 1727204708.44832: worker is 1 (out of 1 available) 43681 1727204708.44963: exiting _queue_task() for managed-node3/service 43681 1727204708.44978: done queuing things up, now waiting for results queue to drain 43681 1727204708.44980: waiting for pending results... 
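The module_args echoed in the NetworkManager result above amount to a plain systemd call, and the task reports changed: false because the unit is already enabled and active (UnitFileState "enabled", ActiveState "active"). Reconstructed from those logged arguments only (a sketch, not the role's actual tasks/main.yml), an equivalent task would look roughly like:

    - name: Enable and start NetworkManager
      ansible.builtin.systemd:
        name: NetworkManager
        state: started
        enabled: true
      # no_log is in effect for this result (see the "censored" ok: output above);
      # where it is set in the play or role is not shown in this log.
      no_log: true

Note that no_log only redacts the summarized result line; the raw stdout/stderr chunks from the connection plugin still appear earlier in the debug stream, which is why the full systemd status JSON is visible above.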
43681 1727204708.45609: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 43681 1727204708.45995: in run() - task 12b410aa-8751-9e86-7728-000000000024 43681 1727204708.46000: variable 'ansible_search_path' from source: unknown 43681 1727204708.46003: variable 'ansible_search_path' from source: unknown 43681 1727204708.46006: calling self._execute() 43681 1727204708.46022: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204708.46035: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204708.46051: variable 'omit' from source: magic vars 43681 1727204708.46487: variable 'ansible_distribution_major_version' from source: facts 43681 1727204708.46511: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204708.46677: variable 'network_provider' from source: set_fact 43681 1727204708.46691: Evaluated conditional (network_provider == "nm"): True 43681 1727204708.46823: variable '__network_wpa_supplicant_required' from source: role '' defaults 43681 1727204708.46942: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 43681 1727204708.47182: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 43681 1727204708.50118: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 43681 1727204708.50205: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 43681 1727204708.50260: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 43681 1727204708.50308: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 43681 1727204708.50351: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 43681 1727204708.50457: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204708.50500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204708.50545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204708.50606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204708.50634: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204708.50797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204708.50800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 43681 1727204708.50803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204708.50826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204708.50850: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204708.50907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204708.50939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204708.50968: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204708.51023: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204708.51042: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204708.51229: variable 'network_connections' from source: task vars 43681 1727204708.51253: variable 'interface' from source: set_fact 43681 1727204708.51359: variable 'interface' from source: set_fact 43681 1727204708.51375: variable 'interface' from source: set_fact 43681 1727204708.51467: variable 'interface' from source: set_fact 43681 1727204708.51678: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 43681 1727204708.51935: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 43681 1727204708.51996: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 43681 1727204708.52107: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 43681 1727204708.52111: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 43681 1727204708.52142: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 43681 1727204708.52174: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 43681 1727204708.52219: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204708.52259: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 43681 1727204708.52333: variable '__network_wireless_connections_defined' from source: role '' defaults 43681 1727204708.52709: variable 'network_connections' from source: task vars 43681 1727204708.52724: variable 'interface' from source: set_fact 43681 1727204708.52811: variable 'interface' from source: set_fact 43681 1727204708.52828: variable 'interface' from source: set_fact 43681 1727204708.52975: variable 'interface' from source: set_fact 43681 1727204708.53028: Evaluated conditional (__network_wpa_supplicant_required): False 43681 1727204708.53038: when evaluation is False, skipping this task 43681 1727204708.53047: _execute() done 43681 1727204708.53090: dumping result to json 43681 1727204708.53094: done dumping result, returning 43681 1727204708.53099: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12b410aa-8751-9e86-7728-000000000024] 43681 1727204708.53111: sending task result for task 12b410aa-8751-9e86-7728-000000000024 43681 1727204708.53273: done sending task result for task 12b410aa-8751-9e86-7728-000000000024 43681 1727204708.53277: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 43681 1727204708.53359: no more pending results, returning what we have 43681 1727204708.53365: results queue empty 43681 1727204708.53366: checking for any_errors_fatal 43681 1727204708.53395: done checking for any_errors_fatal 43681 1727204708.53396: checking for max_fail_percentage 43681 1727204708.53399: done checking for max_fail_percentage 43681 1727204708.53399: checking to see if all hosts have failed and the running result is not ok 43681 1727204708.53400: done checking to see if all hosts have failed 43681 1727204708.53402: getting the remaining hosts for this loop 43681 1727204708.53403: done getting the remaining hosts for this loop 43681 1727204708.53409: getting the next task for host managed-node3 43681 1727204708.53420: done getting next task for host managed-node3 43681 1727204708.53427: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 43681 1727204708.53430: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204708.53449: getting variables 43681 1727204708.53452: in VariableManager get_vars() 43681 1727204708.53706: Calling all_inventory to load vars for managed-node3 43681 1727204708.53710: Calling groups_inventory to load vars for managed-node3 43681 1727204708.53713: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204708.53727: Calling all_plugins_play to load vars for managed-node3 43681 1727204708.53731: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204708.53736: Calling groups_plugins_play to load vars for managed-node3 43681 1727204708.56462: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204708.59538: done with get_vars() 43681 1727204708.59574: done getting variables 43681 1727204708.59649: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 15:05:08 -0400 (0:00:00.155) 0:00:16.263 ***** 43681 1727204708.59685: entering _queue_task() for managed-node3/service 43681 1727204708.60041: worker is 1 (out of 1 available) 43681 1727204708.60055: exiting _queue_task() for managed-node3/service 43681 1727204708.60070: done queuing things up, now waiting for results queue to drain 43681 1727204708.60072: waiting for pending results... 43681 1727204708.60511: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service 43681 1727204708.60557: in run() - task 12b410aa-8751-9e86-7728-000000000025 43681 1727204708.60580: variable 'ansible_search_path' from source: unknown 43681 1727204708.60591: variable 'ansible_search_path' from source: unknown 43681 1727204708.60696: calling self._execute() 43681 1727204708.60754: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204708.60768: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204708.60784: variable 'omit' from source: magic vars 43681 1727204708.61245: variable 'ansible_distribution_major_version' from source: facts 43681 1727204708.61269: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204708.61429: variable 'network_provider' from source: set_fact 43681 1727204708.61444: Evaluated conditional (network_provider == "initscripts"): False 43681 1727204708.61452: when evaluation is False, skipping this task 43681 1727204708.61460: _execute() done 43681 1727204708.61473: dumping result to json 43681 1727204708.61698: done dumping result, returning 43681 1727204708.61703: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service [12b410aa-8751-9e86-7728-000000000025] 43681 1727204708.61706: sending task result for task 12b410aa-8751-9e86-7728-000000000025 43681 1727204708.61786: done sending task result for task 12b410aa-8751-9e86-7728-000000000025 43681 1727204708.61792: WORKER PROCESS EXITING skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 43681 
1727204708.61843: no more pending results, returning what we have 43681 1727204708.61848: results queue empty 43681 1727204708.61849: checking for any_errors_fatal 43681 1727204708.61856: done checking for any_errors_fatal 43681 1727204708.61857: checking for max_fail_percentage 43681 1727204708.61860: done checking for max_fail_percentage 43681 1727204708.61861: checking to see if all hosts have failed and the running result is not ok 43681 1727204708.61862: done checking to see if all hosts have failed 43681 1727204708.61863: getting the remaining hosts for this loop 43681 1727204708.61865: done getting the remaining hosts for this loop 43681 1727204708.61870: getting the next task for host managed-node3 43681 1727204708.61878: done getting next task for host managed-node3 43681 1727204708.61882: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 43681 1727204708.61886: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 43681 1727204708.61906: getting variables 43681 1727204708.61909: in VariableManager get_vars() 43681 1727204708.61959: Calling all_inventory to load vars for managed-node3 43681 1727204708.61962: Calling groups_inventory to load vars for managed-node3 43681 1727204708.61965: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204708.61979: Calling all_plugins_play to load vars for managed-node3 43681 1727204708.61983: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204708.61987: Calling groups_plugins_play to load vars for managed-node3 43681 1727204708.64721: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204708.67876: done with get_vars() 43681 1727204708.67925: done getting variables 43681 1727204708.68002: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 15:05:08 -0400 (0:00:00.083) 0:00:16.347 ***** 43681 1727204708.68046: entering _queue_task() for managed-node3/copy 43681 1727204708.68421: worker is 1 (out of 1 available) 43681 1727204708.68435: exiting _queue_task() for managed-node3/copy 43681 1727204708.68450: done queuing things up, now waiting for results queue to drain 43681 1727204708.68451: waiting for pending results... 
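The two skips above ("Enable and start wpa_supplicant" and "Enable network service") are ordinary when: conditionals: the first fails on __network_wpa_supplicant_required (network_provider == "nm" evaluated True, but the role's 802.1X/wireless defaults leave the flag False), the second on network_provider == "initscripts". A sketch of that pattern follows; only the when: expressions are taken from the log, while the module arguments are illustrative placeholders, since both tasks were skipped before any arguments were logged (the distribution-version guard also evaluated in the log is omitted here):

    - name: Enable and start wpa_supplicant
      ansible.builtin.systemd:
        name: wpa_supplicant    # assumed service name
        state: started
        enabled: true
      when:
        - network_provider == "nm"
        - __network_wpa_supplicant_required | bool

    - name: Enable network service
      ansible.builtin.service:
        name: network           # assumed service name
        state: started
        enabled: true
      when: network_provider == "initscripts"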
43681 1727204708.68769: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 43681 1727204708.69095: in run() - task 12b410aa-8751-9e86-7728-000000000026 43681 1727204708.69100: variable 'ansible_search_path' from source: unknown 43681 1727204708.69102: variable 'ansible_search_path' from source: unknown 43681 1727204708.69105: calling self._execute() 43681 1727204708.69124: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204708.69137: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204708.69153: variable 'omit' from source: magic vars 43681 1727204708.69613: variable 'ansible_distribution_major_version' from source: facts 43681 1727204708.69635: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204708.69800: variable 'network_provider' from source: set_fact 43681 1727204708.69812: Evaluated conditional (network_provider == "initscripts"): False 43681 1727204708.69823: when evaluation is False, skipping this task 43681 1727204708.69832: _execute() done 43681 1727204708.69840: dumping result to json 43681 1727204708.69847: done dumping result, returning 43681 1727204708.69860: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12b410aa-8751-9e86-7728-000000000026] 43681 1727204708.69876: sending task result for task 12b410aa-8751-9e86-7728-000000000026 43681 1727204708.70215: done sending task result for task 12b410aa-8751-9e86-7728-000000000026 43681 1727204708.70221: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 43681 1727204708.70265: no more pending results, returning what we have 43681 1727204708.70269: results queue empty 43681 1727204708.70270: checking for any_errors_fatal 43681 1727204708.70275: done checking for any_errors_fatal 43681 1727204708.70276: checking for max_fail_percentage 43681 1727204708.70278: done checking for max_fail_percentage 43681 1727204708.70280: checking to see if all hosts have failed and the running result is not ok 43681 1727204708.70281: done checking to see if all hosts have failed 43681 1727204708.70282: getting the remaining hosts for this loop 43681 1727204708.70283: done getting the remaining hosts for this loop 43681 1727204708.70287: getting the next task for host managed-node3 43681 1727204708.70294: done getting next task for host managed-node3 43681 1727204708.70298: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 43681 1727204708.70301: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204708.70320: getting variables 43681 1727204708.70322: in VariableManager get_vars() 43681 1727204708.70361: Calling all_inventory to load vars for managed-node3 43681 1727204708.70364: Calling groups_inventory to load vars for managed-node3 43681 1727204708.70367: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204708.70378: Calling all_plugins_play to load vars for managed-node3 43681 1727204708.70381: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204708.70385: Calling groups_plugins_play to load vars for managed-node3 43681 1727204708.72668: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204708.75857: done with get_vars() 43681 1727204708.75900: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 15:05:08 -0400 (0:00:00.079) 0:00:16.426 ***** 43681 1727204708.76008: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 43681 1727204708.76010: Creating lock for fedora.linux_system_roles.network_connections 43681 1727204708.76388: worker is 1 (out of 1 available) 43681 1727204708.76604: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 43681 1727204708.76618: done queuing things up, now waiting for results queue to drain 43681 1727204708.76620: waiting for pending results... 43681 1727204708.76736: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 43681 1727204708.76907: in run() - task 12b410aa-8751-9e86-7728-000000000027 43681 1727204708.76932: variable 'ansible_search_path' from source: unknown 43681 1727204708.76962: variable 'ansible_search_path' from source: unknown 43681 1727204708.76994: calling self._execute() 43681 1727204708.77103: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204708.77180: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204708.77184: variable 'omit' from source: magic vars 43681 1727204708.77596: variable 'ansible_distribution_major_version' from source: facts 43681 1727204708.77623: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204708.77635: variable 'omit' from source: magic vars 43681 1727204708.77711: variable 'omit' from source: magic vars 43681 1727204708.77923: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 43681 1727204708.80834: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 43681 1727204708.80919: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 43681 1727204708.80963: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 43681 1727204708.81010: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 43681 1727204708.81295: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 43681 1727204708.81299: variable 'network_provider' from source: set_fact 43681 1727204708.81326: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204708.81368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204708.81412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204708.81503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204708.81535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204708.81745: variable 'omit' from source: magic vars 43681 1727204708.81839: variable 'omit' from source: magic vars 43681 1727204708.81983: variable 'network_connections' from source: task vars 43681 1727204708.82005: variable 'interface' from source: set_fact 43681 1727204708.82099: variable 'interface' from source: set_fact 43681 1727204708.82113: variable 'interface' from source: set_fact 43681 1727204708.82198: variable 'interface' from source: set_fact 43681 1727204708.82759: variable 'omit' from source: magic vars 43681 1727204708.82774: variable '__lsr_ansible_managed' from source: task vars 43681 1727204708.82860: variable '__lsr_ansible_managed' from source: task vars 43681 1727204708.83379: Loaded config def from plugin (lookup/template) 43681 1727204708.83383: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 43681 1727204708.83385: File lookup term: get_ansible_managed.j2 43681 1727204708.83388: variable 'ansible_search_path' from source: unknown 43681 1727204708.83393: evaluation_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 43681 1727204708.83397: search_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 43681 1727204708.83400: variable 'ansible_search_path' from source: unknown 43681 1727204708.90830: variable 'ansible_managed' from source: unknown 43681 
1727204708.91056: variable 'omit' from source: magic vars 43681 1727204708.91096: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204708.91139: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204708.91169: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204708.91205: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204708.91227: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204708.91267: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204708.91278: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204708.91287: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204708.91418: Set connection var ansible_shell_type to sh 43681 1727204708.91435: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204708.91449: Set connection var ansible_timeout to 10 43681 1727204708.91595: Set connection var ansible_pipelining to False 43681 1727204708.91599: Set connection var ansible_connection to ssh 43681 1727204708.91601: Set connection var ansible_shell_executable to /bin/sh 43681 1727204708.91604: variable 'ansible_shell_executable' from source: unknown 43681 1727204708.91605: variable 'ansible_connection' from source: unknown 43681 1727204708.91608: variable 'ansible_module_compression' from source: unknown 43681 1727204708.91610: variable 'ansible_shell_type' from source: unknown 43681 1727204708.91612: variable 'ansible_shell_executable' from source: unknown 43681 1727204708.91614: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204708.91619: variable 'ansible_pipelining' from source: unknown 43681 1727204708.91621: variable 'ansible_timeout' from source: unknown 43681 1727204708.91623: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204708.91746: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 43681 1727204708.91772: variable 'omit' from source: magic vars 43681 1727204708.91794: starting attempt loop 43681 1727204708.91803: running the handler 43681 1727204708.91826: _low_level_execute_command(): starting 43681 1727204708.91838: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 43681 1727204708.92467: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204708.92485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 43681 1727204708.92499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204708.92552: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204708.92571: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204708.92616: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204708.94382: stdout chunk (state=3): >>>/root <<< 43681 1727204708.94596: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204708.94623: stdout chunk (state=3): >>><<< 43681 1727204708.94626: stderr chunk (state=3): >>><<< 43681 1727204708.94653: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204708.94694: _low_level_execute_command(): starting 43681 1727204708.94704: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204708.946636-44260-67532933606612 `" && echo ansible-tmp-1727204708.946636-44260-67532933606612="` echo /root/.ansible/tmp/ansible-tmp-1727204708.946636-44260-67532933606612 `" ) && sleep 0' 43681 1727204708.95288: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204708.95395: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204708.95398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204708.95409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204708.95480: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204708.95483: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204708.95510: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204708.95583: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204708.97569: stdout chunk (state=3): >>>ansible-tmp-1727204708.946636-44260-67532933606612=/root/.ansible/tmp/ansible-tmp-1727204708.946636-44260-67532933606612 <<< 43681 1727204708.97686: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204708.97736: stderr chunk (state=3): >>><<< 43681 1727204708.97740: stdout chunk (state=3): >>><<< 43681 1727204708.97756: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204708.946636-44260-67532933606612=/root/.ansible/tmp/ansible-tmp-1727204708.946636-44260-67532933606612 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204708.97799: variable 'ansible_module_compression' from source: unknown 43681 1727204708.97847: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 43681 1727204708.97851: ANSIBALLZ: Acquiring lock 43681 1727204708.97854: ANSIBALLZ: Lock acquired: 140156135709184 43681 1727204708.97857: ANSIBALLZ: Creating module 43681 1727204709.15436: ANSIBALLZ: Writing module into payload 43681 1727204709.15773: ANSIBALLZ: Writing module 43681 1727204709.15796: ANSIBALLZ: Renaming module 43681 1727204709.15802: ANSIBALLZ: Done creating module 43681 1727204709.15827: variable 'ansible_facts' from source: unknown 43681 1727204709.15894: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204708.946636-44260-67532933606612/AnsiballZ_network_connections.py 43681 1727204709.16019: Sending initial data 43681 1727204709.16023: Sent initial data (166 bytes) 43681 1727204709.16548: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204709.16552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 43681 1727204709.16558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204709.16560: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204709.16563: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 43681 1727204709.16565: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204709.16616: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204709.16620: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204709.16671: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204709.18403: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 43681 1727204709.18417: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 43681 1727204709.18481: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 43681 1727204709.18538: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-436817kzep24q/tmp5nwqrr2n /root/.ansible/tmp/ansible-tmp-1727204708.946636-44260-67532933606612/AnsiballZ_network_connections.py <<< 43681 1727204709.18543: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204708.946636-44260-67532933606612/AnsiballZ_network_connections.py" <<< 43681 1727204709.18585: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-436817kzep24q/tmp5nwqrr2n" to remote "/root/.ansible/tmp/ansible-tmp-1727204708.946636-44260-67532933606612/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204708.946636-44260-67532933606612/AnsiballZ_network_connections.py" <<< 43681 1727204709.19781: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204709.19855: stderr chunk (state=3): >>><<< 43681 1727204709.19859: stdout chunk (state=3): >>><<< 43681 1727204709.19885: done transferring module to remote 43681 1727204709.19899: _low_level_execute_command(): starting 43681 1727204709.19904: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204708.946636-44260-67532933606612/ /root/.ansible/tmp/ansible-tmp-1727204708.946636-44260-67532933606612/AnsiballZ_network_connections.py && sleep 0' 43681 1727204709.20377: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204709.20381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204709.20384: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204709.20386: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204709.20388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204709.20448: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204709.20455: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204709.20485: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204709.22597: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204709.22602: stdout chunk (state=3): >>><<< 43681 1727204709.22605: stderr chunk (state=3): >>><<< 43681 1727204709.22607: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204709.22610: _low_level_execute_command(): starting 43681 1727204709.22612: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204708.946636-44260-67532933606612/AnsiballZ_network_connections.py && sleep 0' 43681 1727204709.23169: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204709.23179: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204709.23194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204709.23210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204709.23226: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 43681 1727204709.23234: stderr chunk (state=3): >>>debug2: match not found <<< 43681 1727204709.23245: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204709.23274: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 43681 1727204709.23278: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address <<< 43681 1727204709.23281: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 43681 1727204709.23283: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204709.23384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204709.23398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204709.23401: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 43681 1727204709.23404: stderr chunk (state=3): >>>debug2: match found <<< 43681 1727204709.23406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204709.23408: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204709.23423: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204709.23433: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204709.23518: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204709.58401: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 
4ec82e11-94f9-4a67-a45f-0542deb3f3a9\n[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 4ec82e11-94f9-4a67-a45f-0542deb3f3a9 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"dhcp4": false, "address": ["198.51.100.3/26", "2001:db8::2/32"], "route": [{"network": "198.51.100.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4, "table": 30200}, {"network": "198.51.100.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2, "table": 30400}, {"network": "2001:db8::4", "prefix": 32, "gateway": "2001:db8::1", "metric": 2, "table": 30600}], "routing_rule": [{"priority": 30200, "from": "198.51.100.58/26", "table": 30200}, {"priority": 30201, "family": "ipv4", "fwmark": 1, "fwmask": 1, "table": 30200}, {"priority": 30202, "family": "ipv4", "ipproto": 6, "table": 30200}, {"priority": 30203, "family": "ipv4", "sport": "128 - 256", "table": 30200}, {"priority": 30204, "family": "ipv4", "tos": 8, "table": 30200}, {"priority": 30400, "to": "198.51.100.128/26", "table": 30400}, {"priority": 30401, "family": "ipv4", "iif": "iiftest", "table": 30400}, {"priority": 30402, "family": "ipv4", "oif": "oiftest", "table": 30400}, {"priority": 30403, "from": "0.0.0.0/0", "to": "0.0.0.0/0", "table": 30400}, {"priority": 30600, "to": "2001:db8::4/32", "table": 30600}, {"priority": 30601, "family": "ipv6", "dport": "128 - 256", "invert": true, "table": 30600}, {"priority": 30602, "from": "::/0", "to": "::/0", "table": 30600}, {"priority": 200, "from": "198.51.100.56/26", "table": "custom"}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0"<<< 43681 1727204709.58568: stdout chunk (state=3): >>>, "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"dhcp4": false, "address": ["198.51.100.3/26", "2001:db8::2/32"], "route": [{"network": "198.51.100.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4, "table": 30200}, {"network": "198.51.100.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2, "table": 30400}, {"network": "2001:db8::4", "prefix": 32, "gateway": "2001:db8::1", "metric": 2, "table": 30600}], "routing_rule": [{"priority": 30200, "from": "198.51.100.58/26", "table": 30200}, {"priority": 30201, "family": "ipv4", "fwmark": 1, "fwmask": 1, "table": 30200}, {"priority": 30202, "family": "ipv4", "ipproto": 6, "table": 30200}, {"priority": 30203, "family": "ipv4", "sport": "128 - 256", "table": 30200}, {"priority": 30204, "family": "ipv4", "tos": 8, "table": 30200}, {"priority": 30400, "to": "198.51.100.128/26", "table": 30400}, {"priority": 30401, "family": "ipv4", "iif": "iiftest", "table": 30400}, {"priority": 30402, "family": "ipv4", "oif": "oiftest", "table": 30400}, {"priority": 30403, "from": "0.0.0.0/0", "to": "0.0.0.0/0", "table": 30400}, {"priority": 30600, "to": "2001:db8::4/32", "table": 30600}, {"priority": 30601, "family": "ipv6", "dport": "128 - 256", "invert": true, "table": 30600}, {"priority": 30602, "from": "::/0", "to": "::/0", "table": 30600}, {"priority": 200, "from": "198.51.100.56/26", "table": "custom"}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 43681 1727204709.60896: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 43681 1727204709.60966: stderr chunk (state=3): >>><<< 43681 1727204709.60970: stdout chunk (state=3): >>><<< 43681 1727204709.60991: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 4ec82e11-94f9-4a67-a45f-0542deb3f3a9\n[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 4ec82e11-94f9-4a67-a45f-0542deb3f3a9 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"dhcp4": false, "address": ["198.51.100.3/26", "2001:db8::2/32"], "route": [{"network": "198.51.100.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4, "table": 30200}, {"network": "198.51.100.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2, "table": 30400}, {"network": "2001:db8::4", "prefix": 32, "gateway": "2001:db8::1", "metric": 2, "table": 30600}], "routing_rule": [{"priority": 30200, "from": "198.51.100.58/26", "table": 30200}, {"priority": 30201, "family": "ipv4", "fwmark": 1, "fwmask": 1, "table": 30200}, {"priority": 30202, "family": "ipv4", "ipproto": 6, "table": 30200}, {"priority": 30203, "family": "ipv4", "sport": "128 - 256", "table": 30200}, {"priority": 30204, "family": "ipv4", "tos": 8, "table": 30200}, {"priority": 30400, "to": "198.51.100.128/26", "table": 30400}, {"priority": 30401, "family": "ipv4", "iif": "iiftest", "table": 30400}, {"priority": 30402, "family": "ipv4", "oif": "oiftest", "table": 30400}, {"priority": 30403, "from": "0.0.0.0/0", "to": "0.0.0.0/0", "table": 30400}, {"priority": 30600, "to": "2001:db8::4/32", "table": 30600}, {"priority": 30601, "family": "ipv6", "dport": "128 - 256", "invert": true, "table": 30600}, {"priority": 30602, "from": "::/0", "to": "::/0", "table": 30600}, {"priority": 200, "from": "198.51.100.56/26", "table": "custom"}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"dhcp4": false, "address": ["198.51.100.3/26", "2001:db8::2/32"], "route": [{"network": "198.51.100.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4, "table": 30200}, {"network": "198.51.100.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2, "table": 30400}, {"network": "2001:db8::4", "prefix": 32, "gateway": "2001:db8::1", "metric": 2, "table": 30600}], "routing_rule": [{"priority": 30200, "from": "198.51.100.58/26", "table": 30200}, {"priority": 30201, "family": "ipv4", "fwmark": 1, "fwmask": 1, "table": 30200}, {"priority": 30202, "family": "ipv4", "ipproto": 6, "table": 30200}, {"priority": 30203, "family": "ipv4", "sport": "128 - 256", "table": 30200}, {"priority": 30204, "family": "ipv4", "tos": 8, "table": 30200}, {"priority": 30400, "to": "198.51.100.128/26", "table": 30400}, {"priority": 30401, "family": "ipv4", "iif": "iiftest", "table": 30400}, {"priority": 30402, "family": "ipv4", "oif": "oiftest", "table": 30400}, {"priority": 30403, "from": "0.0.0.0/0", "to": "0.0.0.0/0", "table": 30400}, {"priority": 30600, "to": "2001:db8::4/32", "table": 30600}, {"priority": 30601, "family": 
"ipv6", "dport": "128 - 256", "invert": true, "table": 30600}, {"priority": 30602, "from": "::/0", "to": "::/0", "table": 30600}, {"priority": 200, "from": "198.51.100.56/26", "table": "custom"}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 43681 1727204709.61123: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'interface_name': 'ethtest0', 'state': 'up', 'type': 'ethernet', 'autoconnect': True, 'ip': {'dhcp4': False, 'address': ['198.51.100.3/26', '2001:db8::2/32'], 'route': [{'network': '198.51.100.64', 'prefix': 26, 'gateway': '198.51.100.6', 'metric': 4, 'table': 30200}, {'network': '198.51.100.128', 'prefix': 26, 'gateway': '198.51.100.1', 'metric': 2, 'table': 30400}, {'network': '2001:db8::4', 'prefix': 32, 'gateway': '2001:db8::1', 'metric': 2, 'table': 30600}], 'routing_rule': [{'priority': 30200, 'from': '198.51.100.58/26', 'table': 30200}, {'priority': 30201, 'family': 'ipv4', 'fwmark': 1, 'fwmask': 1, 'table': 30200}, {'priority': 30202, 'family': 'ipv4', 'ipproto': 6, 'table': 30200}, {'priority': 30203, 'family': 'ipv4', 'sport': '128 - 256', 'table': 30200}, {'priority': 30204, 'family': 'ipv4', 'tos': 8, 'table': 30200}, {'priority': 30400, 'to': '198.51.100.128/26', 'table': 30400}, {'priority': 30401, 'family': 'ipv4', 'iif': 'iiftest', 'table': 30400}, {'priority': 30402, 'family': 'ipv4', 'oif': 'oiftest', 'table': 30400}, {'priority': 30403, 'from': '0.0.0.0/0', 'to': '0.0.0.0/0', 'table': 30400}, {'priority': 30600, 'to': '2001:db8::4/32', 'table': 30600}, {'priority': 30601, 'family': 'ipv6', 'dport': '128 - 256', 'invert': True, 'table': 30600}, {'priority': 30602, 'from': '::/0', 'to': '::/0', 'table': 30600}, {'priority': 200, 'from': '198.51.100.56/26', 'table': 'custom'}]}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 
'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204708.946636-44260-67532933606612/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 43681 1727204709.61133: _low_level_execute_command(): starting 43681 1727204709.61139: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204708.946636-44260-67532933606612/ > /dev/null 2>&1 && sleep 0' 43681 1727204709.61641: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204709.61644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 43681 1727204709.61647: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204709.61649: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204709.61651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 43681 1727204709.61653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204709.61696: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204709.61720: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204709.61752: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204709.67003: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204709.67061: stderr chunk (state=3): >>><<< 43681 1727204709.67067: stdout chunk (state=3): >>><<< 43681 1727204709.67087: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 43681 1727204709.67096: handler run complete 43681 1727204709.67212: attempt loop complete, returning result 43681 1727204709.67218: _execute() done 43681 1727204709.67222: dumping result to json 43681 1727204709.67233: done dumping result, returning 43681 1727204709.67243: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12b410aa-8751-9e86-7728-000000000027] 43681 1727204709.67249: sending task result for task 12b410aa-8751-9e86-7728-000000000027 43681 1727204709.67426: done sending task result for task 12b410aa-8751-9e86-7728-000000000027 43681 1727204709.67429: WORKER PROCESS EXITING changed: [managed-node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": true, "interface_name": "ethtest0", "ip": { "address": [ "198.51.100.3/26", "2001:db8::2/32" ], "dhcp4": false, "route": [ { "gateway": "198.51.100.6", "metric": 4, "network": "198.51.100.64", "prefix": 26, "table": 30200 }, { "gateway": "198.51.100.1", "metric": 2, "network": "198.51.100.128", "prefix": 26, "table": 30400 }, { "gateway": "2001:db8::1", "metric": 2, "network": "2001:db8::4", "prefix": 32, "table": 30600 } ], "routing_rule": [ { "from": "198.51.100.58/26", "priority": 30200, "table": 30200 }, { "family": "ipv4", "fwmark": 1, "fwmask": 1, "priority": 30201, "table": 30200 }, { "family": "ipv4", "ipproto": 6, "priority": 30202, "table": 30200 }, { "family": "ipv4", "priority": 30203, "sport": "128 - 256", "table": 30200 }, { "family": "ipv4", "priority": 30204, "table": 30200, "tos": 8 }, { "priority": 30400, "table": 30400, "to": "198.51.100.128/26" }, { "family": "ipv4", "iif": "iiftest", "priority": 30401, "table": 30400 }, { "family": "ipv4", "oif": "oiftest", "priority": 30402, "table": 30400 }, { "from": "0.0.0.0/0", "priority": 30403, "table": 30400, "to": "0.0.0.0/0" }, { "priority": 30600, "table": 30600, "to": "2001:db8::4/32" }, { "dport": "128 - 256", "family": "ipv6", "invert": true, "priority": 30601, "table": 30600 }, { "from": "::/0", "priority": 30602, "table": 30600, "to": "::/0" }, { "from": "198.51.100.56/26", "priority": 200, "table": "custom" } ] }, "name": "ethtest0", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [003] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 4ec82e11-94f9-4a67-a45f-0542deb3f3a9 [004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 4ec82e11-94f9-4a67-a45f-0542deb3f3a9 (not-active) 43681 1727204709.67847: no more pending results, returning what we have 43681 1727204709.67850: results queue empty 43681 1727204709.67852: checking for any_errors_fatal 43681 1727204709.67858: done checking for any_errors_fatal 43681 1727204709.67859: checking for max_fail_percentage 43681 1727204709.67861: done checking for max_fail_percentage 43681 1727204709.67862: checking to see if all hosts have failed and the running result is not ok 43681 1727204709.67867: done checking to see if all hosts have failed 43681 1727204709.67868: getting the remaining hosts for this loop 43681 1727204709.67870: done getting the remaining hosts for this loop 43681 1727204709.67874: getting the next task for host managed-node3 43681 1727204709.67880: done getting next task for host managed-node3 43681 1727204709.67883: ^ task is: TASK: 
fedora.linux_system_roles.network : Configure networking state 43681 1727204709.67885: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 43681 1727204709.67896: getting variables 43681 1727204709.67898: in VariableManager get_vars() 43681 1727204709.67932: Calling all_inventory to load vars for managed-node3 43681 1727204709.67934: Calling groups_inventory to load vars for managed-node3 43681 1727204709.67936: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204709.67944: Calling all_plugins_play to load vars for managed-node3 43681 1727204709.67946: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204709.67949: Calling groups_plugins_play to load vars for managed-node3 43681 1727204709.69270: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204709.70858: done with get_vars() 43681 1727204709.70882: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:05:09 -0400 (0:00:00.949) 0:00:17.376 ***** 43681 1727204709.70961: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_state 43681 1727204709.70962: Creating lock for fedora.linux_system_roles.network_state 43681 1727204709.71242: worker is 1 (out of 1 available) 43681 1727204709.71260: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_state 43681 1727204709.71275: done queuing things up, now waiting for results queue to drain 43681 1727204709.71277: waiting for pending results... 
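The module arguments echoed in the 'Configure networking connection profiles' result above map directly onto the role's network_connections input. As a reconstructed sketch only (this is not the playbook that produced this run, and the __header, __debug_flags, ignore_errors and force_state_change keys are filled in by the role itself), the invocation corresponds to playbook variables along these lines, abridged to one route and three representative routing rules:

    network_connections:
      - name: ethtest0
        interface_name: ethtest0
        type: ethernet
        state: up
        autoconnect: true
        ip:
          dhcp4: false
          address:
            - 198.51.100.3/26
            - 2001:db8::2/32
          route:
            # first of the three routes in the result; the second IPv4 route and
            # the IPv6 route use the same keys with their own gateway/metric/table
            - network: 198.51.100.64
              prefix: 26
              gateway: 198.51.100.6
              metric: 4
              table: 30200
          routing_rule:
            # source-based rule into table 30200
            - priority: 30200
              from: 198.51.100.58/26
              table: 30200
            # inverted IPv6 destination-port-range rule into table 30600
            - priority: 30601
              family: ipv6
              dport: 128 - 256
              invert: true
              table: 30600
            # rule that refers to a routing table by name instead of number
            - priority: 200
              from: 198.51.100.56/26
              table: custom
            # the remaining rules in the result (fwmark/fwmask, ipproto, sport,
            # tos, iif, oif, and the catch-all from/to rules) use the same keys

Note that the table key accepts both numeric IDs (30200, 30400, 30600) and a name ("custom"), exactly as the rules echoed in the result above show.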
43681 1727204709.71474: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state 43681 1727204709.71579: in run() - task 12b410aa-8751-9e86-7728-000000000028 43681 1727204709.71593: variable 'ansible_search_path' from source: unknown 43681 1727204709.71597: variable 'ansible_search_path' from source: unknown 43681 1727204709.71633: calling self._execute() 43681 1727204709.71710: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204709.71721: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204709.71730: variable 'omit' from source: magic vars 43681 1727204709.72056: variable 'ansible_distribution_major_version' from source: facts 43681 1727204709.72067: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204709.72179: variable 'network_state' from source: role '' defaults 43681 1727204709.72190: Evaluated conditional (network_state != {}): False 43681 1727204709.72197: when evaluation is False, skipping this task 43681 1727204709.72200: _execute() done 43681 1727204709.72205: dumping result to json 43681 1727204709.72209: done dumping result, returning 43681 1727204709.72217: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state [12b410aa-8751-9e86-7728-000000000028] 43681 1727204709.72226: sending task result for task 12b410aa-8751-9e86-7728-000000000028 43681 1727204709.72320: done sending task result for task 12b410aa-8751-9e86-7728-000000000028 43681 1727204709.72324: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 43681 1727204709.72377: no more pending results, returning what we have 43681 1727204709.72382: results queue empty 43681 1727204709.72383: checking for any_errors_fatal 43681 1727204709.72410: done checking for any_errors_fatal 43681 1727204709.72411: checking for max_fail_percentage 43681 1727204709.72414: done checking for max_fail_percentage 43681 1727204709.72414: checking to see if all hosts have failed and the running result is not ok 43681 1727204709.72415: done checking to see if all hosts have failed 43681 1727204709.72416: getting the remaining hosts for this loop 43681 1727204709.72418: done getting the remaining hosts for this loop 43681 1727204709.72422: getting the next task for host managed-node3 43681 1727204709.72429: done getting next task for host managed-node3 43681 1727204709.72433: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 43681 1727204709.72436: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204709.72452: getting variables 43681 1727204709.72454: in VariableManager get_vars() 43681 1727204709.72488: Calling all_inventory to load vars for managed-node3 43681 1727204709.72498: Calling groups_inventory to load vars for managed-node3 43681 1727204709.72501: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204709.72511: Calling all_plugins_play to load vars for managed-node3 43681 1727204709.72514: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204709.72517: Calling groups_plugins_play to load vars for managed-node3 43681 1727204709.73806: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204709.75401: done with get_vars() 43681 1727204709.75426: done getting variables 43681 1727204709.75478: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:05:09 -0400 (0:00:00.045) 0:00:17.421 ***** 43681 1727204709.75509: entering _queue_task() for managed-node3/debug 43681 1727204709.75765: worker is 1 (out of 1 available) 43681 1727204709.75780: exiting _queue_task() for managed-node3/debug 43681 1727204709.75797: done queuing things up, now waiting for results queue to drain 43681 1727204709.75799: waiting for pending results... 43681 1727204709.75999: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 43681 1727204709.76110: in run() - task 12b410aa-8751-9e86-7728-000000000029 43681 1727204709.76126: variable 'ansible_search_path' from source: unknown 43681 1727204709.76131: variable 'ansible_search_path' from source: unknown 43681 1727204709.76166: calling self._execute() 43681 1727204709.76244: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204709.76250: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204709.76263: variable 'omit' from source: magic vars 43681 1727204709.76595: variable 'ansible_distribution_major_version' from source: facts 43681 1727204709.76601: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204709.76608: variable 'omit' from source: magic vars 43681 1727204709.76652: variable 'omit' from source: magic vars 43681 1727204709.76682: variable 'omit' from source: magic vars 43681 1727204709.76725: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204709.76758: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204709.76775: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204709.76793: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204709.76812: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204709.76837: variable 
'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204709.76840: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204709.76844: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204709.76932: Set connection var ansible_shell_type to sh 43681 1727204709.76939: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204709.76946: Set connection var ansible_timeout to 10 43681 1727204709.76954: Set connection var ansible_pipelining to False 43681 1727204709.76961: Set connection var ansible_connection to ssh 43681 1727204709.76967: Set connection var ansible_shell_executable to /bin/sh 43681 1727204709.76986: variable 'ansible_shell_executable' from source: unknown 43681 1727204709.76991: variable 'ansible_connection' from source: unknown 43681 1727204709.76994: variable 'ansible_module_compression' from source: unknown 43681 1727204709.76999: variable 'ansible_shell_type' from source: unknown 43681 1727204709.77002: variable 'ansible_shell_executable' from source: unknown 43681 1727204709.77006: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204709.77012: variable 'ansible_pipelining' from source: unknown 43681 1727204709.77015: variable 'ansible_timeout' from source: unknown 43681 1727204709.77029: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204709.77145: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 43681 1727204709.77155: variable 'omit' from source: magic vars 43681 1727204709.77162: starting attempt loop 43681 1727204709.77165: running the handler 43681 1727204709.77276: variable '__network_connections_result' from source: set_fact 43681 1727204709.77342: handler run complete 43681 1727204709.77363: attempt loop complete, returning result 43681 1727204709.77367: _execute() done 43681 1727204709.77370: dumping result to json 43681 1727204709.77373: done dumping result, returning 43681 1727204709.77383: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12b410aa-8751-9e86-7728-000000000029] 43681 1727204709.77388: sending task result for task 12b410aa-8751-9e86-7728-000000000029 43681 1727204709.77479: done sending task result for task 12b410aa-8751-9e86-7728-000000000029 43681 1727204709.77482: WORKER PROCESS EXITING ok: [managed-node3] => { "__network_connections_result.stderr_lines": [ "[003] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 4ec82e11-94f9-4a67-a45f-0542deb3f3a9", "[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 4ec82e11-94f9-4a67-a45f-0542deb3f3a9 (not-active)" ] } 43681 1727204709.77559: no more pending results, returning what we have 43681 1727204709.77563: results queue empty 43681 1727204709.77564: checking for any_errors_fatal 43681 1727204709.77572: done checking for any_errors_fatal 43681 1727204709.77573: checking for max_fail_percentage 43681 1727204709.77575: done checking for max_fail_percentage 43681 1727204709.77575: checking to see if all hosts have failed and the running result is not ok 43681 1727204709.77576: done checking to see if all hosts have failed 43681 1727204709.77577: 
getting the remaining hosts for this loop 43681 1727204709.77579: done getting the remaining hosts for this loop 43681 1727204709.77583: getting the next task for host managed-node3 43681 1727204709.77592: done getting next task for host managed-node3 43681 1727204709.77596: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 43681 1727204709.77599: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 43681 1727204709.77611: getting variables 43681 1727204709.77613: in VariableManager get_vars() 43681 1727204709.77648: Calling all_inventory to load vars for managed-node3 43681 1727204709.77651: Calling groups_inventory to load vars for managed-node3 43681 1727204709.77654: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204709.77663: Calling all_plugins_play to load vars for managed-node3 43681 1727204709.77666: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204709.77670: Calling groups_plugins_play to load vars for managed-node3 43681 1727204709.78890: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204709.80555: done with get_vars() 43681 1727204709.80576: done getting variables 43681 1727204709.80629: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 15:05:09 -0400 (0:00:00.051) 0:00:17.473 ***** 43681 1727204709.80657: entering _queue_task() for managed-node3/debug 43681 1727204709.80913: worker is 1 (out of 1 available) 43681 1727204709.80928: exiting _queue_task() for managed-node3/debug 43681 1727204709.80942: done queuing things up, now waiting for results queue to drain 43681 1727204709.80944: waiting for pending results... 
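The two task entries above show the role's skip-and-report pattern: 'Configure networking state' is skipped because network_state is still the role default of {}, and the stderr lines captured from the connections module run are echoed with a debug task. A rough reconstruction of that pattern (the role's actual tasks/main.yml is only referenced by path in this log, so this is a sketch, not its verbatim content):

    - name: Configure networking state
      fedora.linux_system_roles.network_state:
        # arguments omitted; the task never runs in this play
      when: network_state != {}

    - name: Show stderr messages for the network_connections
      debug:
        var: __network_connections_result.stderr_lines

Each of these tasks also evaluates ansible_distribution_major_version != '6' first, as visible at the start of every entry, so none of them would run against an EL6-era managed node.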
43681 1727204709.81149: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 43681 1727204709.81256: in run() - task 12b410aa-8751-9e86-7728-00000000002a 43681 1727204709.81270: variable 'ansible_search_path' from source: unknown 43681 1727204709.81275: variable 'ansible_search_path' from source: unknown 43681 1727204709.81310: calling self._execute() 43681 1727204709.81386: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204709.81391: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204709.81406: variable 'omit' from source: magic vars 43681 1727204709.81732: variable 'ansible_distribution_major_version' from source: facts 43681 1727204709.81744: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204709.81750: variable 'omit' from source: magic vars 43681 1727204709.81801: variable 'omit' from source: magic vars 43681 1727204709.81837: variable 'omit' from source: magic vars 43681 1727204709.81872: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204709.81905: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204709.81926: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204709.81945: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204709.81957: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204709.81985: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204709.81991: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204709.81994: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204709.82078: Set connection var ansible_shell_type to sh 43681 1727204709.82084: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204709.82092: Set connection var ansible_timeout to 10 43681 1727204709.82102: Set connection var ansible_pipelining to False 43681 1727204709.82108: Set connection var ansible_connection to ssh 43681 1727204709.82114: Set connection var ansible_shell_executable to /bin/sh 43681 1727204709.82136: variable 'ansible_shell_executable' from source: unknown 43681 1727204709.82139: variable 'ansible_connection' from source: unknown 43681 1727204709.82143: variable 'ansible_module_compression' from source: unknown 43681 1727204709.82145: variable 'ansible_shell_type' from source: unknown 43681 1727204709.82150: variable 'ansible_shell_executable' from source: unknown 43681 1727204709.82153: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204709.82160: variable 'ansible_pipelining' from source: unknown 43681 1727204709.82168: variable 'ansible_timeout' from source: unknown 43681 1727204709.82195: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204709.82301: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 43681 
1727204709.82311: variable 'omit' from source: magic vars 43681 1727204709.82320: starting attempt loop 43681 1727204709.82323: running the handler 43681 1727204709.82363: variable '__network_connections_result' from source: set_fact 43681 1727204709.82433: variable '__network_connections_result' from source: set_fact 43681 1727204709.82753: handler run complete 43681 1727204709.82823: attempt loop complete, returning result 43681 1727204709.82826: _execute() done 43681 1727204709.82829: dumping result to json 43681 1727204709.82839: done dumping result, returning 43681 1727204709.82847: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12b410aa-8751-9e86-7728-00000000002a] 43681 1727204709.82853: sending task result for task 12b410aa-8751-9e86-7728-00000000002a 43681 1727204709.82978: done sending task result for task 12b410aa-8751-9e86-7728-00000000002a 43681 1727204709.82981: WORKER PROCESS EXITING ok: [managed-node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": true, "interface_name": "ethtest0", "ip": { "address": [ "198.51.100.3/26", "2001:db8::2/32" ], "dhcp4": false, "route": [ { "gateway": "198.51.100.6", "metric": 4, "network": "198.51.100.64", "prefix": 26, "table": 30200 }, { "gateway": "198.51.100.1", "metric": 2, "network": "198.51.100.128", "prefix": 26, "table": 30400 }, { "gateway": "2001:db8::1", "metric": 2, "network": "2001:db8::4", "prefix": 32, "table": 30600 } ], "routing_rule": [ { "from": "198.51.100.58/26", "priority": 30200, "table": 30200 }, { "family": "ipv4", "fwmark": 1, "fwmask": 1, "priority": 30201, "table": 30200 }, { "family": "ipv4", "ipproto": 6, "priority": 30202, "table": 30200 }, { "family": "ipv4", "priority": 30203, "sport": "128 - 256", "table": 30200 }, { "family": "ipv4", "priority": 30204, "table": 30200, "tos": 8 }, { "priority": 30400, "table": 30400, "to": "198.51.100.128/26" }, { "family": "ipv4", "iif": "iiftest", "priority": 30401, "table": 30400 }, { "family": "ipv4", "oif": "oiftest", "priority": 30402, "table": 30400 }, { "from": "0.0.0.0/0", "priority": 30403, "table": 30400, "to": "0.0.0.0/0" }, { "priority": 30600, "table": 30600, "to": "2001:db8::4/32" }, { "dport": "128 - 256", "family": "ipv6", "invert": true, "priority": 30601, "table": 30600 }, { "from": "::/0", "priority": 30602, "table": 30600, "to": "::/0" }, { "from": "198.51.100.56/26", "priority": 200, "table": "custom" } ] }, "name": "ethtest0", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[003] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 4ec82e11-94f9-4a67-a45f-0542deb3f3a9\n[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 4ec82e11-94f9-4a67-a45f-0542deb3f3a9 (not-active)\n", "stderr_lines": [ "[003] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 4ec82e11-94f9-4a67-a45f-0542deb3f3a9", "[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 4ec82e11-94f9-4a67-a45f-0542deb3f3a9 (not-active)" ] } } 43681 1727204709.83206: no more pending results, returning what we have 43681 1727204709.83209: results queue empty 43681 1727204709.83210: checking for any_errors_fatal 43681 1727204709.83214: done checking for 
any_errors_fatal 43681 1727204709.83215: checking for max_fail_percentage 43681 1727204709.83218: done checking for max_fail_percentage 43681 1727204709.83219: checking to see if all hosts have failed and the running result is not ok 43681 1727204709.83220: done checking to see if all hosts have failed 43681 1727204709.83220: getting the remaining hosts for this loop 43681 1727204709.83221: done getting the remaining hosts for this loop 43681 1727204709.83224: getting the next task for host managed-node3 43681 1727204709.83227: done getting next task for host managed-node3 43681 1727204709.83230: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 43681 1727204709.83232: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 43681 1727204709.83240: getting variables 43681 1727204709.83241: in VariableManager get_vars() 43681 1727204709.83267: Calling all_inventory to load vars for managed-node3 43681 1727204709.83269: Calling groups_inventory to load vars for managed-node3 43681 1727204709.83270: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204709.83277: Calling all_plugins_play to load vars for managed-node3 43681 1727204709.83279: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204709.83282: Calling groups_plugins_play to load vars for managed-node3 43681 1727204709.84477: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204709.86082: done with get_vars() 43681 1727204709.86109: done getting variables 43681 1727204709.86164: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 15:05:09 -0400 (0:00:00.055) 0:00:17.528 ***** 43681 1727204709.86194: entering _queue_task() for managed-node3/debug 43681 1727204709.86460: worker is 1 (out of 1 available) 43681 1727204709.86477: exiting _queue_task() for managed-node3/debug 43681 1727204709.86492: done queuing things up, now waiting for results queue to drain 43681 1727204709.86494: waiting for pending results... 
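The __network_connections_result fact printed in full above is what the rest of this test run can verify against. As an illustrative follow-up only (this check is not part of the role and does not appear later in this log), a result of this shape could be asserted on like so:

    - name: Check that the ethtest0 profile change was applied without errors
      assert:
        that:
          - __network_connections_result is changed
          - not __network_connections_result.failed
          - "'ethtest0' in __network_connections_result.stderr"

The role itself continues with its own follow-up steps instead, starting with the 'Show debug messages for the network_state' task and the 'Re-test connectivity' ping below.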
43681 1727204709.86696: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 43681 1727204709.86809: in run() - task 12b410aa-8751-9e86-7728-00000000002b 43681 1727204709.86826: variable 'ansible_search_path' from source: unknown 43681 1727204709.86831: variable 'ansible_search_path' from source: unknown 43681 1727204709.86860: calling self._execute() 43681 1727204709.86937: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204709.86949: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204709.86957: variable 'omit' from source: magic vars 43681 1727204709.87276: variable 'ansible_distribution_major_version' from source: facts 43681 1727204709.87287: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204709.87396: variable 'network_state' from source: role '' defaults 43681 1727204709.87407: Evaluated conditional (network_state != {}): False 43681 1727204709.87410: when evaluation is False, skipping this task 43681 1727204709.87414: _execute() done 43681 1727204709.87419: dumping result to json 43681 1727204709.87422: done dumping result, returning 43681 1727204709.87429: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12b410aa-8751-9e86-7728-00000000002b] 43681 1727204709.87435: sending task result for task 12b410aa-8751-9e86-7728-00000000002b 43681 1727204709.87530: done sending task result for task 12b410aa-8751-9e86-7728-00000000002b 43681 1727204709.87533: WORKER PROCESS EXITING skipping: [managed-node3] => { "false_condition": "network_state != {}" } 43681 1727204709.87588: no more pending results, returning what we have 43681 1727204709.87595: results queue empty 43681 1727204709.87596: checking for any_errors_fatal 43681 1727204709.87609: done checking for any_errors_fatal 43681 1727204709.87610: checking for max_fail_percentage 43681 1727204709.87611: done checking for max_fail_percentage 43681 1727204709.87612: checking to see if all hosts have failed and the running result is not ok 43681 1727204709.87613: done checking to see if all hosts have failed 43681 1727204709.87614: getting the remaining hosts for this loop 43681 1727204709.87618: done getting the remaining hosts for this loop 43681 1727204709.87622: getting the next task for host managed-node3 43681 1727204709.87629: done getting next task for host managed-node3 43681 1727204709.87633: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 43681 1727204709.87636: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204709.87658: getting variables 43681 1727204709.87660: in VariableManager get_vars() 43681 1727204709.87696: Calling all_inventory to load vars for managed-node3 43681 1727204709.87699: Calling groups_inventory to load vars for managed-node3 43681 1727204709.87702: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204709.87711: Calling all_plugins_play to load vars for managed-node3 43681 1727204709.87714: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204709.87720: Calling groups_plugins_play to load vars for managed-node3 43681 1727204709.89036: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204709.90626: done with get_vars() 43681 1727204709.90648: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 15:05:09 -0400 (0:00:00.045) 0:00:17.574 ***** 43681 1727204709.90735: entering _queue_task() for managed-node3/ping 43681 1727204709.90737: Creating lock for ping 43681 1727204709.91008: worker is 1 (out of 1 available) 43681 1727204709.91025: exiting _queue_task() for managed-node3/ping 43681 1727204709.91039: done queuing things up, now waiting for results queue to drain 43681 1727204709.91041: waiting for pending results... 43681 1727204709.91238: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 43681 1727204709.91342: in run() - task 12b410aa-8751-9e86-7728-00000000002c 43681 1727204709.91355: variable 'ansible_search_path' from source: unknown 43681 1727204709.91359: variable 'ansible_search_path' from source: unknown 43681 1727204709.91397: calling self._execute() 43681 1727204709.91471: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204709.91481: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204709.91495: variable 'omit' from source: magic vars 43681 1727204709.91810: variable 'ansible_distribution_major_version' from source: facts 43681 1727204709.91825: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204709.91828: variable 'omit' from source: magic vars 43681 1727204709.91880: variable 'omit' from source: magic vars 43681 1727204709.91912: variable 'omit' from source: magic vars 43681 1727204709.91952: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204709.91984: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204709.92005: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204709.92023: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204709.92037: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204709.92064: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204709.92068: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204709.92072: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204709.92158: Set connection var ansible_shell_type to sh 43681 
1727204709.92166: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204709.92172: Set connection var ansible_timeout to 10 43681 1727204709.92180: Set connection var ansible_pipelining to False 43681 1727204709.92187: Set connection var ansible_connection to ssh 43681 1727204709.92195: Set connection var ansible_shell_executable to /bin/sh 43681 1727204709.92214: variable 'ansible_shell_executable' from source: unknown 43681 1727204709.92220: variable 'ansible_connection' from source: unknown 43681 1727204709.92224: variable 'ansible_module_compression' from source: unknown 43681 1727204709.92226: variable 'ansible_shell_type' from source: unknown 43681 1727204709.92229: variable 'ansible_shell_executable' from source: unknown 43681 1727204709.92231: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204709.92237: variable 'ansible_pipelining' from source: unknown 43681 1727204709.92239: variable 'ansible_timeout' from source: unknown 43681 1727204709.92246: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204709.92425: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 43681 1727204709.92434: variable 'omit' from source: magic vars 43681 1727204709.92440: starting attempt loop 43681 1727204709.92443: running the handler 43681 1727204709.92457: _low_level_execute_command(): starting 43681 1727204709.92466: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 43681 1727204709.93025: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204709.93029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204709.93034: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204709.93095: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204709.93099: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204709.93153: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204709.94905: stdout chunk (state=3): >>>/root <<< 43681 1727204709.95019: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204709.95073: stderr chunk (state=3): >>><<< 43681 1727204709.95079: stdout chunk (state=3): >>><<< 43681 1727204709.95104: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204709.95121: _low_level_execute_command(): starting 43681 1727204709.95126: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204709.9510407-44295-196135475844157 `" && echo ansible-tmp-1727204709.9510407-44295-196135475844157="` echo /root/.ansible/tmp/ansible-tmp-1727204709.9510407-44295-196135475844157 `" ) && sleep 0' 43681 1727204709.95600: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204709.95603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 43681 1727204709.95606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 43681 1727204709.95616: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 43681 1727204709.95618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204709.95673: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204709.95676: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204709.95711: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204709.97696: stdout chunk (state=3): >>>ansible-tmp-1727204709.9510407-44295-196135475844157=/root/.ansible/tmp/ansible-tmp-1727204709.9510407-44295-196135475844157 <<< 43681 1727204709.97809: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204709.97856: stderr chunk (state=3): >>><<< 43681 1727204709.97860: stdout chunk (state=3): >>><<< 43681 1727204709.97877: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204709.9510407-44295-196135475844157=/root/.ansible/tmp/ansible-tmp-1727204709.9510407-44295-196135475844157 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204709.97927: variable 'ansible_module_compression' from source: unknown 43681 1727204709.97964: ANSIBALLZ: Using lock for ping 43681 1727204709.97968: ANSIBALLZ: Acquiring lock 43681 1727204709.97971: ANSIBALLZ: Lock acquired: 140156135314384 43681 1727204709.97975: ANSIBALLZ: Creating module 43681 1727204710.12404: ANSIBALLZ: Writing module into payload 43681 1727204710.12450: ANSIBALLZ: Writing module 43681 1727204710.12469: ANSIBALLZ: Renaming module 43681 1727204710.12475: ANSIBALLZ: Done creating module 43681 1727204710.12490: variable 'ansible_facts' from source: unknown 43681 1727204710.12542: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204709.9510407-44295-196135475844157/AnsiballZ_ping.py 43681 1727204710.12665: Sending initial data 43681 1727204710.12669: Sent initial data (153 bytes) 43681 1727204710.13165: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204710.13169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204710.13172: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 43681 1727204710.13177: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204710.13232: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204710.13235: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204710.13283: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 43681 1727204710.14999: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 43681 1727204710.15018: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 43681 1727204710.15034: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 43681 1727204710.15069: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-436817kzep24q/tmp13x9mqmq /root/.ansible/tmp/ansible-tmp-1727204709.9510407-44295-196135475844157/AnsiballZ_ping.py <<< 43681 1727204710.15078: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204709.9510407-44295-196135475844157/AnsiballZ_ping.py" <<< 43681 1727204710.15105: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-436817kzep24q/tmp13x9mqmq" to remote "/root/.ansible/tmp/ansible-tmp-1727204709.9510407-44295-196135475844157/AnsiballZ_ping.py" <<< 43681 1727204710.15115: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204709.9510407-44295-196135475844157/AnsiballZ_ping.py" <<< 43681 1727204710.15842: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204710.15915: stderr chunk (state=3): >>><<< 43681 1727204710.15921: stdout chunk (state=3): >>><<< 43681 1727204710.15938: done transferring module to remote 43681 1727204710.15949: _low_level_execute_command(): starting 43681 1727204710.15955: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204709.9510407-44295-196135475844157/ /root/.ansible/tmp/ansible-tmp-1727204709.9510407-44295-196135475844157/AnsiballZ_ping.py && sleep 0' 43681 1727204710.16432: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204710.16435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 43681 1727204710.16438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 43681 1727204710.16440: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204710.16446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204710.16499: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204710.16502: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204710.16541: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204710.18383: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204710.18439: stderr chunk (state=3): >>><<< 43681 1727204710.18442: stdout chunk (state=3): >>><<< 43681 1727204710.18458: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204710.18462: _low_level_execute_command(): starting 43681 1727204710.18467: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204709.9510407-44295-196135475844157/AnsiballZ_ping.py && sleep 0' 43681 1727204710.18931: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204710.18934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204710.18938: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 43681 1727204710.18942: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204710.18995: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204710.18998: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204710.19046: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204710.36047: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 43681 1727204710.37553: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 43681 1727204710.37558: stdout chunk (state=3): >>><<< 43681 1727204710.37560: stderr chunk (state=3): >>><<< 43681 1727204710.37696: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 43681 1727204710.37701: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204709.9510407-44295-196135475844157/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 43681 1727204710.37703: _low_level_execute_command(): starting 43681 1727204710.37706: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204709.9510407-44295-196135475844157/ > /dev/null 2>&1 && sleep 0' 43681 1727204710.38312: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204710.38329: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204710.38345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204710.38478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204710.38481: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204710.38511: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204710.38576: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204710.40608: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204710.40612: stdout chunk (state=3): >>><<< 43681 1727204710.40618: stderr chunk (state=3): >>><<< 43681 1727204710.40795: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204710.40798: handler run complete 43681 1727204710.40801: attempt loop complete, returning result 43681 1727204710.40803: _execute() done 43681 1727204710.40806: dumping result to json 43681 1727204710.40808: done dumping result, returning 43681 1727204710.40810: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [12b410aa-8751-9e86-7728-00000000002c] 43681 1727204710.40813: sending task result for task 12b410aa-8751-9e86-7728-00000000002c 43681 1727204710.40882: done sending task result for task 12b410aa-8751-9e86-7728-00000000002c 43681 1727204710.40885: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "ping": "pong" } 43681 1727204710.40954: no more pending results, returning what we have 43681 1727204710.40957: results queue empty 43681 1727204710.40958: checking for any_errors_fatal 43681 1727204710.40967: done checking for any_errors_fatal 43681 1727204710.40968: checking for max_fail_percentage 43681 1727204710.40969: done checking for max_fail_percentage 43681 1727204710.40970: checking to see if all hosts have failed and the running result is not ok 43681 1727204710.40972: done checking to see if all hosts have failed 43681 1727204710.40973: getting the remaining hosts for this loop 43681 1727204710.40974: done getting the remaining hosts for this loop 43681 1727204710.40979: getting the next task for host managed-node3 43681 1727204710.40994: done getting 
next task for host managed-node3 43681 1727204710.40998: ^ task is: TASK: meta (role_complete) 43681 1727204710.41001: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 43681 1727204710.41014: getting variables 43681 1727204710.41016: in VariableManager get_vars() 43681 1727204710.41058: Calling all_inventory to load vars for managed-node3 43681 1727204710.41062: Calling groups_inventory to load vars for managed-node3 43681 1727204710.41065: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204710.41076: Calling all_plugins_play to load vars for managed-node3 43681 1727204710.41080: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204710.41083: Calling groups_plugins_play to load vars for managed-node3 43681 1727204710.43871: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204710.55081: done with get_vars() 43681 1727204710.55213: done getting variables 43681 1727204710.55308: done queuing things up, now waiting for results queue to drain 43681 1727204710.55316: results queue empty 43681 1727204710.55317: checking for any_errors_fatal 43681 1727204710.55321: done checking for any_errors_fatal 43681 1727204710.55322: checking for max_fail_percentage 43681 1727204710.55324: done checking for max_fail_percentage 43681 1727204710.55325: checking to see if all hosts have failed and the running result is not ok 43681 1727204710.55326: done checking to see if all hosts have failed 43681 1727204710.55327: getting the remaining hosts for this loop 43681 1727204710.55328: done getting the remaining hosts for this loop 43681 1727204710.55332: getting the next task for host managed-node3 43681 1727204710.55337: done getting next task for host managed-node3 43681 1727204710.55340: ^ task is: TASK: Get the routing rule for looking up the table 30200 43681 1727204710.55342: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204710.55345: getting variables 43681 1727204710.55346: in VariableManager get_vars() 43681 1727204710.55363: Calling all_inventory to load vars for managed-node3 43681 1727204710.55366: Calling groups_inventory to load vars for managed-node3 43681 1727204710.55369: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204710.55376: Calling all_plugins_play to load vars for managed-node3 43681 1727204710.55379: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204710.55383: Calling groups_plugins_play to load vars for managed-node3 43681 1727204710.57404: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204710.60578: done with get_vars() 43681 1727204710.60617: done getting variables 43681 1727204710.60676: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get the routing rule for looking up the table 30200] ********************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:115 Tuesday 24 September 2024 15:05:10 -0400 (0:00:00.699) 0:00:18.273 ***** 43681 1727204710.60707: entering _queue_task() for managed-node3/command 43681 1727204710.61304: worker is 1 (out of 1 available) 43681 1727204710.61316: exiting _queue_task() for managed-node3/command 43681 1727204710.61327: done queuing things up, now waiting for results queue to drain 43681 1727204710.61328: waiting for pending results... 43681 1727204710.61571: running TaskExecutor() for managed-node3/TASK: Get the routing rule for looking up the table 30200 43681 1727204710.61577: in run() - task 12b410aa-8751-9e86-7728-00000000005c 43681 1727204710.61583: variable 'ansible_search_path' from source: unknown 43681 1727204710.61629: calling self._execute() 43681 1727204710.61748: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204710.61765: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204710.61794: variable 'omit' from source: magic vars 43681 1727204710.62269: variable 'ansible_distribution_major_version' from source: facts 43681 1727204710.62287: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204710.62455: variable 'ansible_distribution_major_version' from source: facts 43681 1727204710.62467: Evaluated conditional (ansible_distribution_major_version != "7"): True 43681 1727204710.62479: variable 'omit' from source: magic vars 43681 1727204710.62508: variable 'omit' from source: magic vars 43681 1727204710.62594: variable 'omit' from source: magic vars 43681 1727204710.62619: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204710.62675: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204710.62704: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204710.62731: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204710.62791: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204710.62801: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204710.62810: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204710.62819: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204710.62952: Set connection var ansible_shell_type to sh 43681 1727204710.62972: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204710.62986: Set connection var ansible_timeout to 10 43681 1727204710.63007: Set connection var ansible_pipelining to False 43681 1727204710.63078: Set connection var ansible_connection to ssh 43681 1727204710.63082: Set connection var ansible_shell_executable to /bin/sh 43681 1727204710.63084: variable 'ansible_shell_executable' from source: unknown 43681 1727204710.63087: variable 'ansible_connection' from source: unknown 43681 1727204710.63091: variable 'ansible_module_compression' from source: unknown 43681 1727204710.63094: variable 'ansible_shell_type' from source: unknown 43681 1727204710.63096: variable 'ansible_shell_executable' from source: unknown 43681 1727204710.63102: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204710.63112: variable 'ansible_pipelining' from source: unknown 43681 1727204710.63124: variable 'ansible_timeout' from source: unknown 43681 1727204710.63134: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204710.63315: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 43681 1727204710.63408: variable 'omit' from source: magic vars 43681 1727204710.63415: starting attempt loop 43681 1727204710.63418: running the handler 43681 1727204710.63421: _low_level_execute_command(): starting 43681 1727204710.63423: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 43681 1727204710.64184: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204710.64301: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204710.64349: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204710.64425: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204710.66216: stdout 
chunk (state=3): >>>/root <<< 43681 1727204710.66407: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204710.66423: stdout chunk (state=3): >>><<< 43681 1727204710.66438: stderr chunk (state=3): >>><<< 43681 1727204710.66464: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204710.66483: _low_level_execute_command(): starting 43681 1727204710.66497: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204710.6647043-44312-97769673138928 `" && echo ansible-tmp-1727204710.6647043-44312-97769673138928="` echo /root/.ansible/tmp/ansible-tmp-1727204710.6647043-44312-97769673138928 `" ) && sleep 0' 43681 1727204710.67143: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204710.67157: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204710.67173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204710.67212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204710.67232: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 43681 1727204710.67347: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204710.67375: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204710.67449: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204710.69577: stdout chunk (state=3): 
>>>ansible-tmp-1727204710.6647043-44312-97769673138928=/root/.ansible/tmp/ansible-tmp-1727204710.6647043-44312-97769673138928 <<< 43681 1727204710.69684: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204710.69688: stdout chunk (state=3): >>><<< 43681 1727204710.69694: stderr chunk (state=3): >>><<< 43681 1727204710.69719: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204710.6647043-44312-97769673138928=/root/.ansible/tmp/ansible-tmp-1727204710.6647043-44312-97769673138928 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204710.69825: variable 'ansible_module_compression' from source: unknown 43681 1727204710.69829: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-436817kzep24q/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 43681 1727204710.69875: variable 'ansible_facts' from source: unknown 43681 1727204710.69976: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204710.6647043-44312-97769673138928/AnsiballZ_command.py 43681 1727204710.70178: Sending initial data 43681 1727204710.70181: Sent initial data (155 bytes) 43681 1727204710.70822: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204710.70907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204710.70952: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204710.70969: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204710.70994: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 
1727204710.71057: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204710.72684: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 43681 1727204710.72733: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 43681 1727204710.72806: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-436817kzep24q/tmp2vaqhtvm /root/.ansible/tmp/ansible-tmp-1727204710.6647043-44312-97769673138928/AnsiballZ_command.py <<< 43681 1727204710.72810: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204710.6647043-44312-97769673138928/AnsiballZ_command.py" <<< 43681 1727204710.72841: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-436817kzep24q/tmp2vaqhtvm" to remote "/root/.ansible/tmp/ansible-tmp-1727204710.6647043-44312-97769673138928/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204710.6647043-44312-97769673138928/AnsiballZ_command.py" <<< 43681 1727204710.73954: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204710.74082: stderr chunk (state=3): >>><<< 43681 1727204710.74086: stdout chunk (state=3): >>><<< 43681 1727204710.74091: done transferring module to remote 43681 1727204710.74094: _low_level_execute_command(): starting 43681 1727204710.74097: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204710.6647043-44312-97769673138928/ /root/.ansible/tmp/ansible-tmp-1727204710.6647043-44312-97769673138928/AnsiballZ_command.py && sleep 0' 43681 1727204710.74684: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204710.74703: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204710.74720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204710.74751: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204710.74857: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204710.74879: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204710.74903: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204710.74926: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204710.74987: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204710.76899: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204710.76914: stdout chunk (state=3): >>><<< 43681 1727204710.76930: stderr chunk (state=3): >>><<< 43681 1727204710.76951: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204710.76960: _low_level_execute_command(): starting 43681 1727204710.76969: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204710.6647043-44312-97769673138928/AnsiballZ_command.py && sleep 0' 43681 1727204710.77608: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204710.77623: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204710.77637: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204710.77700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204710.77767: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204710.77784: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204710.77813: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 
4 <<< 43681 1727204710.77887: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204710.95907: stdout chunk (state=3): >>> {"changed": true, "stdout": "30200:\tfrom 198.51.100.58/26 lookup 30200 proto static\n30201:\tfrom all fwmark 0x1/0x1 lookup 30200 proto static\n30202:\tfrom all ipproto tcp lookup 30200 proto static\n30203:\tfrom all sport 128-256 lookup 30200 proto static\n30204:\tfrom all tos 0x08 lookup 30200 proto static", "stderr": "", "rc": 0, "cmd": ["ip", "rule", "list", "table", "30200"], "start": "2024-09-24 15:05:10.952420", "end": "2024-09-24 15:05:10.957747", "delta": "0:00:00.005327", "msg": "", "invocation": {"module_args": {"_raw_params": "ip rule list table 30200", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 43681 1727204710.97641: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 43681 1727204710.97685: stderr chunk (state=3): >>><<< 43681 1727204710.97689: stdout chunk (state=3): >>><<< 43681 1727204710.97712: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "30200:\tfrom 198.51.100.58/26 lookup 30200 proto static\n30201:\tfrom all fwmark 0x1/0x1 lookup 30200 proto static\n30202:\tfrom all ipproto tcp lookup 30200 proto static\n30203:\tfrom all sport 128-256 lookup 30200 proto static\n30204:\tfrom all tos 0x08 lookup 30200 proto static", "stderr": "", "rc": 0, "cmd": ["ip", "rule", "list", "table", "30200"], "start": "2024-09-24 15:05:10.952420", "end": "2024-09-24 15:05:10.957747", "delta": "0:00:00.005327", "msg": "", "invocation": {"module_args": {"_raw_params": "ip rule list table 30200", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
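
The module result above captures the output of "ip rule list table 30200" on managed-node3. As a rough illustration only (not taken from the network role or its test playbook), the same listing could be reproduced and parsed outside Ansible with a short Python helper; the function and field names below are assumptions chosen for this sketch, and it presumes a Linux host with iproute2 available.

import re
import subprocess

def list_rules(table="30200"):
    # Run the same command the task above executed on the managed node.
    out = subprocess.run(
        ["ip", "rule", "list", "table", table],
        capture_output=True, text=True, check=True,
    ).stdout
    rules = []
    for line in out.splitlines():
        # Lines look like: "30200:\tfrom 198.51.100.58/26 lookup 30200 proto static"
        m = re.match(r"(\d+):\s+(.*)", line)
        if m:
            rules.append({"priority": int(m.group(1)), "selector": m.group(2)})
    return rules

if __name__ == "__main__":
    for rule in list_rules():
        print(rule["priority"], rule["selector"])

With the rules shown in this log, such a helper would report priorities 30200 through 30204, matching the STDOUT block printed for the task result further below.
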
43681 1727204710.97754: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip rule list table 30200', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204710.6647043-44312-97769673138928/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 43681 1727204710.97766: _low_level_execute_command(): starting 43681 1727204710.97771: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204710.6647043-44312-97769673138928/ > /dev/null 2>&1 && sleep 0' 43681 1727204710.98236: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204710.98242: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204710.98249: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204710.98251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204710.98304: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204710.98311: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204710.98363: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204711.00293: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204711.00340: stderr chunk (state=3): >>><<< 43681 1727204711.00346: stdout chunk (state=3): >>><<< 43681 1727204711.00364: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204711.00371: handler run complete 43681 1727204711.00395: Evaluated conditional (False): False 43681 1727204711.00406: attempt loop complete, returning result 43681 1727204711.00409: _execute() done 43681 1727204711.00412: dumping result to json 43681 1727204711.00421: done dumping result, returning 43681 1727204711.00430: done running TaskExecutor() for managed-node3/TASK: Get the routing rule for looking up the table 30200 [12b410aa-8751-9e86-7728-00000000005c] 43681 1727204711.00436: sending task result for task 12b410aa-8751-9e86-7728-00000000005c 43681 1727204711.00550: done sending task result for task 12b410aa-8751-9e86-7728-00000000005c 43681 1727204711.00553: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": [ "ip", "rule", "list", "table", "30200" ], "delta": "0:00:00.005327", "end": "2024-09-24 15:05:10.957747", "rc": 0, "start": "2024-09-24 15:05:10.952420" } STDOUT: 30200: from 198.51.100.58/26 lookup 30200 proto static 30201: from all fwmark 0x1/0x1 lookup 30200 proto static 30202: from all ipproto tcp lookup 30200 proto static 30203: from all sport 128-256 lookup 30200 proto static 30204: from all tos 0x08 lookup 30200 proto static 43681 1727204711.00663: no more pending results, returning what we have 43681 1727204711.00667: results queue empty 43681 1727204711.00669: checking for any_errors_fatal 43681 1727204711.00671: done checking for any_errors_fatal 43681 1727204711.00672: checking for max_fail_percentage 43681 1727204711.00676: done checking for max_fail_percentage 43681 1727204711.00677: checking to see if all hosts have failed and the running result is not ok 43681 1727204711.00678: done checking to see if all hosts have failed 43681 1727204711.00678: getting the remaining hosts for this loop 43681 1727204711.00680: done getting the remaining hosts for this loop 43681 1727204711.00685: getting the next task for host managed-node3 43681 1727204711.00692: done getting next task for host managed-node3 43681 1727204711.00695: ^ task is: TASK: Get the routing rule for looking up the table 30400 43681 1727204711.00698: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204711.00701: getting variables 43681 1727204711.00703: in VariableManager get_vars() 43681 1727204711.00741: Calling all_inventory to load vars for managed-node3 43681 1727204711.00744: Calling groups_inventory to load vars for managed-node3 43681 1727204711.00747: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204711.00759: Calling all_plugins_play to load vars for managed-node3 43681 1727204711.00761: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204711.00765: Calling groups_plugins_play to load vars for managed-node3 43681 1727204711.02148: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204711.03735: done with get_vars() 43681 1727204711.03760: done getting variables 43681 1727204711.03813: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get the routing rule for looking up the table 30400] ********************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:122 Tuesday 24 September 2024 15:05:11 -0400 (0:00:00.431) 0:00:18.705 ***** 43681 1727204711.03839: entering _queue_task() for managed-node3/command 43681 1727204711.04107: worker is 1 (out of 1 available) 43681 1727204711.04123: exiting _queue_task() for managed-node3/command 43681 1727204711.04138: done queuing things up, now waiting for results queue to drain 43681 1727204711.04140: waiting for pending results... 43681 1727204711.04363: running TaskExecutor() for managed-node3/TASK: Get the routing rule for looking up the table 30400 43681 1727204711.04437: in run() - task 12b410aa-8751-9e86-7728-00000000005d 43681 1727204711.04450: variable 'ansible_search_path' from source: unknown 43681 1727204711.04483: calling self._execute() 43681 1727204711.04572: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204711.04580: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204711.04598: variable 'omit' from source: magic vars 43681 1727204711.04939: variable 'ansible_distribution_major_version' from source: facts 43681 1727204711.04951: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204711.05053: variable 'ansible_distribution_major_version' from source: facts 43681 1727204711.05060: Evaluated conditional (ansible_distribution_major_version != "7"): True 43681 1727204711.05068: variable 'omit' from source: magic vars 43681 1727204711.05092: variable 'omit' from source: magic vars 43681 1727204711.05129: variable 'omit' from source: magic vars 43681 1727204711.05167: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204711.05202: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204711.05224: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204711.05242: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204711.05256: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204711.05283: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204711.05286: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204711.05295: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204711.05380: Set connection var ansible_shell_type to sh 43681 1727204711.05387: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204711.05396: Set connection var ansible_timeout to 10 43681 1727204711.05406: Set connection var ansible_pipelining to False 43681 1727204711.05412: Set connection var ansible_connection to ssh 43681 1727204711.05421: Set connection var ansible_shell_executable to /bin/sh 43681 1727204711.05441: variable 'ansible_shell_executable' from source: unknown 43681 1727204711.05444: variable 'ansible_connection' from source: unknown 43681 1727204711.05447: variable 'ansible_module_compression' from source: unknown 43681 1727204711.05451: variable 'ansible_shell_type' from source: unknown 43681 1727204711.05454: variable 'ansible_shell_executable' from source: unknown 43681 1727204711.05459: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204711.05463: variable 'ansible_pipelining' from source: unknown 43681 1727204711.05471: variable 'ansible_timeout' from source: unknown 43681 1727204711.05474: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204711.05599: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 43681 1727204711.05610: variable 'omit' from source: magic vars 43681 1727204711.05617: starting attempt loop 43681 1727204711.05625: running the handler 43681 1727204711.05639: _low_level_execute_command(): starting 43681 1727204711.05646: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 43681 1727204711.06207: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204711.06212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204711.06218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204711.06274: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204711.06279: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204711.06323: stderr 
chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204711.08079: stdout chunk (state=3): >>>/root <<< 43681 1727204711.08188: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204711.08251: stderr chunk (state=3): >>><<< 43681 1727204711.08255: stdout chunk (state=3): >>><<< 43681 1727204711.08279: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204711.08293: _low_level_execute_command(): starting 43681 1727204711.08301: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204711.0827925-44329-122359637130245 `" && echo ansible-tmp-1727204711.0827925-44329-122359637130245="` echo /root/.ansible/tmp/ansible-tmp-1727204711.0827925-44329-122359637130245 `" ) && sleep 0' 43681 1727204711.08769: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204711.08800: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204711.08804: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204711.08814: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204711.08863: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204711.08871: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204711.08913: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204711.10936: stdout chunk (state=3): 
>>>ansible-tmp-1727204711.0827925-44329-122359637130245=/root/.ansible/tmp/ansible-tmp-1727204711.0827925-44329-122359637130245 <<< 43681 1727204711.11053: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204711.11107: stderr chunk (state=3): >>><<< 43681 1727204711.11111: stdout chunk (state=3): >>><<< 43681 1727204711.11132: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204711.0827925-44329-122359637130245=/root/.ansible/tmp/ansible-tmp-1727204711.0827925-44329-122359637130245 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204711.11162: variable 'ansible_module_compression' from source: unknown 43681 1727204711.11210: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-436817kzep24q/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 43681 1727204711.11251: variable 'ansible_facts' from source: unknown 43681 1727204711.11306: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204711.0827925-44329-122359637130245/AnsiballZ_command.py 43681 1727204711.11431: Sending initial data 43681 1727204711.11435: Sent initial data (156 bytes) 43681 1727204711.11880: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204711.11921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204711.11924: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 43681 1727204711.11929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204711.11932: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204711.11983: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204711.11991: stderr chunk 
(state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204711.12027: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204711.13704: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 43681 1727204711.13712: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 43681 1727204711.13739: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 43681 1727204711.13777: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-436817kzep24q/tmp8ki_jxng /root/.ansible/tmp/ansible-tmp-1727204711.0827925-44329-122359637130245/AnsiballZ_command.py <<< 43681 1727204711.13780: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204711.0827925-44329-122359637130245/AnsiballZ_command.py" <<< 43681 1727204711.13808: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-436817kzep24q/tmp8ki_jxng" to remote "/root/.ansible/tmp/ansible-tmp-1727204711.0827925-44329-122359637130245/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204711.0827925-44329-122359637130245/AnsiballZ_command.py" <<< 43681 1727204711.14571: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204711.14646: stderr chunk (state=3): >>><<< 43681 1727204711.14649: stdout chunk (state=3): >>><<< 43681 1727204711.14669: done transferring module to remote 43681 1727204711.14681: _low_level_execute_command(): starting 43681 1727204711.14686: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204711.0827925-44329-122359637130245/ /root/.ansible/tmp/ansible-tmp-1727204711.0827925-44329-122359637130245/AnsiballZ_command.py && sleep 0' 43681 1727204711.15169: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204711.15174: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 43681 1727204711.15177: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204711.15179: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 43681 1727204711.15181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204711.15234: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204711.15238: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204711.15276: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204711.17138: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204711.17191: stderr chunk (state=3): >>><<< 43681 1727204711.17195: stdout chunk (state=3): >>><<< 43681 1727204711.17212: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204711.17215: _low_level_execute_command(): starting 43681 1727204711.17223: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204711.0827925-44329-122359637130245/AnsiballZ_command.py && sleep 0' 43681 1727204711.17668: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204711.17708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 43681 1727204711.17712: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204711.17715: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 43681 1727204711.17721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204711.17767: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 
setting O_NONBLOCK <<< 43681 1727204711.17770: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204711.17819: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204711.35711: stdout chunk (state=3): >>> {"changed": true, "stdout": "30400:\tfrom all to 198.51.100.128/26 lookup 30400 proto static\n30401:\tfrom all iif iiftest [detached] lookup 30400 proto static\n30402:\tfrom all oif oiftest [detached] lookup 30400 proto static\n30403:\tfrom all lookup 30400 proto static", "stderr": "", "rc": 0, "cmd": ["ip", "rule", "list", "table", "30400"], "start": "2024-09-24 15:05:11.352205", "end": "2024-09-24 15:05:11.355949", "delta": "0:00:00.003744", "msg": "", "invocation": {"module_args": {"_raw_params": "ip rule list table 30400", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 43681 1727204711.37387: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 43681 1727204711.37466: stderr chunk (state=3): >>><<< 43681 1727204711.37470: stdout chunk (state=3): >>><<< 43681 1727204711.37491: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "30400:\tfrom all to 198.51.100.128/26 lookup 30400 proto static\n30401:\tfrom all iif iiftest [detached] lookup 30400 proto static\n30402:\tfrom all oif oiftest [detached] lookup 30400 proto static\n30403:\tfrom all lookup 30400 proto static", "stderr": "", "rc": 0, "cmd": ["ip", "rule", "list", "table", "30400"], "start": "2024-09-24 15:05:11.352205", "end": "2024-09-24 15:05:11.355949", "delta": "0:00:00.003744", "msg": "", "invocation": {"module_args": {"_raw_params": "ip rule list table 30400", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
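The JSON blob just received over the multiplexed SSH channel is the raw return of the command module for the task defined at tests_routing_rules.yml:122. The playbook source itself is not reproduced in this log, so the following is only a minimal sketch of what that verification step plausibly looks like as a standalone play; the register name is invented for illustration, and changed_when: false is inferred from the fact that the module reports "changed": true while the final task status below is ok with "changed": false.

    - hosts: managed-node3
      gather_facts: false
      tasks:
        - name: Get the routing rule for looking up the table 30400
          command: ip rule list table 30400
          register: route_rule_table_30400   # variable name assumed for illustration
          changed_when: false                # inferred: the task result below reports changed=false

The four rules echoed in stdout (priorities 30400-30403) are the destination-based, iif-based, oif-based, and catch-all entries that the test presumably configured for table 30400 earlier in the run.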
43681 1727204711.37610: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip rule list table 30400', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204711.0827925-44329-122359637130245/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 43681 1727204711.37615: _low_level_execute_command(): starting 43681 1727204711.37620: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204711.0827925-44329-122359637130245/ > /dev/null 2>&1 && sleep 0' 43681 1727204711.38260: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204711.38264: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204711.38267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204711.38270: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204711.38276: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 43681 1727204711.38283: stderr chunk (state=3): >>>debug2: match not found <<< 43681 1727204711.38404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204711.38445: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204711.38498: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204711.40430: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204711.40480: stderr chunk (state=3): >>><<< 43681 1727204711.40483: stdout chunk (state=3): >>><<< 43681 1727204711.40501: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204711.40509: handler run complete 43681 1727204711.40535: Evaluated conditional (False): False 43681 1727204711.40549: attempt loop complete, returning result 43681 1727204711.40552: _execute() done 43681 1727204711.40555: dumping result to json 43681 1727204711.40575: done dumping result, returning 43681 1727204711.40603: done running TaskExecutor() for managed-node3/TASK: Get the routing rule for looking up the table 30400 [12b410aa-8751-9e86-7728-00000000005d] 43681 1727204711.40606: sending task result for task 12b410aa-8751-9e86-7728-00000000005d 43681 1727204711.40736: done sending task result for task 12b410aa-8751-9e86-7728-00000000005d 43681 1727204711.40739: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": [ "ip", "rule", "list", "table", "30400" ], "delta": "0:00:00.003744", "end": "2024-09-24 15:05:11.355949", "rc": 0, "start": "2024-09-24 15:05:11.352205" } STDOUT: 30400: from all to 198.51.100.128/26 lookup 30400 proto static 30401: from all iif iiftest [detached] lookup 30400 proto static 30402: from all oif oiftest [detached] lookup 30400 proto static 30403: from all lookup 30400 proto static 43681 1727204711.40875: no more pending results, returning what we have 43681 1727204711.40879: results queue empty 43681 1727204711.40880: checking for any_errors_fatal 43681 1727204711.40888: done checking for any_errors_fatal 43681 1727204711.40891: checking for max_fail_percentage 43681 1727204711.40893: done checking for max_fail_percentage 43681 1727204711.40894: checking to see if all hosts have failed and the running result is not ok 43681 1727204711.40895: done checking to see if all hosts have failed 43681 1727204711.40896: getting the remaining hosts for this loop 43681 1727204711.40898: done getting the remaining hosts for this loop 43681 1727204711.40902: getting the next task for host managed-node3 43681 1727204711.40908: done getting next task for host managed-node3 43681 1727204711.40911: ^ task is: TASK: Get the routing rule for looking up the table 30600 43681 1727204711.40913: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204711.40916: getting variables 43681 1727204711.40918: in VariableManager get_vars() 43681 1727204711.40954: Calling all_inventory to load vars for managed-node3 43681 1727204711.40957: Calling groups_inventory to load vars for managed-node3 43681 1727204711.40959: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204711.40971: Calling all_plugins_play to load vars for managed-node3 43681 1727204711.40974: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204711.40977: Calling groups_plugins_play to load vars for managed-node3 43681 1727204711.43203: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204711.44942: done with get_vars() 43681 1727204711.44965: done getting variables 43681 1727204711.45021: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get the routing rule for looking up the table 30600] ********************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:129 Tuesday 24 September 2024 15:05:11 -0400 (0:00:00.412) 0:00:19.117 ***** 43681 1727204711.45059: entering _queue_task() for managed-node3/command 43681 1727204711.45441: worker is 1 (out of 1 available) 43681 1727204711.45457: exiting _queue_task() for managed-node3/command 43681 1727204711.45471: done queuing things up, now waiting for results queue to drain 43681 1727204711.45472: waiting for pending results... 43681 1727204711.46118: running TaskExecutor() for managed-node3/TASK: Get the routing rule for looking up the table 30600 43681 1727204711.46124: in run() - task 12b410aa-8751-9e86-7728-00000000005e 43681 1727204711.46128: variable 'ansible_search_path' from source: unknown 43681 1727204711.46132: calling self._execute() 43681 1727204711.46135: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204711.46137: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204711.46141: variable 'omit' from source: magic vars 43681 1727204711.46488: variable 'ansible_distribution_major_version' from source: facts 43681 1727204711.46500: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204711.46603: variable 'ansible_distribution_major_version' from source: facts 43681 1727204711.46609: Evaluated conditional (ansible_distribution_major_version != "7"): True 43681 1727204711.46619: variable 'omit' from source: magic vars 43681 1727204711.46635: variable 'omit' from source: magic vars 43681 1727204711.46670: variable 'omit' from source: magic vars 43681 1727204711.46708: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204711.46742: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204711.46766: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204711.46783: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204711.46796: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204711.46826: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204711.46830: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204711.46833: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204711.46923: Set connection var ansible_shell_type to sh 43681 1727204711.46927: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204711.46935: Set connection var ansible_timeout to 10 43681 1727204711.46944: Set connection var ansible_pipelining to False 43681 1727204711.46950: Set connection var ansible_connection to ssh 43681 1727204711.46957: Set connection var ansible_shell_executable to /bin/sh 43681 1727204711.46978: variable 'ansible_shell_executable' from source: unknown 43681 1727204711.46982: variable 'ansible_connection' from source: unknown 43681 1727204711.46987: variable 'ansible_module_compression' from source: unknown 43681 1727204711.46989: variable 'ansible_shell_type' from source: unknown 43681 1727204711.46997: variable 'ansible_shell_executable' from source: unknown 43681 1727204711.47000: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204711.47002: variable 'ansible_pipelining' from source: unknown 43681 1727204711.47006: variable 'ansible_timeout' from source: unknown 43681 1727204711.47013: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204711.47134: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 43681 1727204711.47144: variable 'omit' from source: magic vars 43681 1727204711.47151: starting attempt loop 43681 1727204711.47154: running the handler 43681 1727204711.47170: _low_level_execute_command(): starting 43681 1727204711.47177: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 43681 1727204711.47720: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204711.47724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204711.47729: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 43681 1727204711.47733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204711.47782: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204711.47793: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 <<< 43681 1727204711.47833: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204711.49561: stdout chunk (state=3): >>>/root <<< 43681 1727204711.49673: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204711.49729: stderr chunk (state=3): >>><<< 43681 1727204711.49732: stdout chunk (state=3): >>><<< 43681 1727204711.49754: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204711.49766: _low_level_execute_command(): starting 43681 1727204711.49772: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204711.497544-44346-82105790158106 `" && echo ansible-tmp-1727204711.497544-44346-82105790158106="` echo /root/.ansible/tmp/ansible-tmp-1727204711.497544-44346-82105790158106 `" ) && sleep 0' 43681 1727204711.50245: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204711.50248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204711.50260: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 43681 1727204711.50263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204711.50308: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204711.50312: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204711.50362: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 
1727204711.52409: stdout chunk (state=3): >>>ansible-tmp-1727204711.497544-44346-82105790158106=/root/.ansible/tmp/ansible-tmp-1727204711.497544-44346-82105790158106 <<< 43681 1727204711.52526: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204711.52581: stderr chunk (state=3): >>><<< 43681 1727204711.52585: stdout chunk (state=3): >>><<< 43681 1727204711.52607: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204711.497544-44346-82105790158106=/root/.ansible/tmp/ansible-tmp-1727204711.497544-44346-82105790158106 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204711.52636: variable 'ansible_module_compression' from source: unknown 43681 1727204711.52682: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-436817kzep24q/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 43681 1727204711.52722: variable 'ansible_facts' from source: unknown 43681 1727204711.52776: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204711.497544-44346-82105790158106/AnsiballZ_command.py 43681 1727204711.52897: Sending initial data 43681 1727204711.52901: Sent initial data (154 bytes) 43681 1727204711.53373: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204711.53377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 43681 1727204711.53380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204711.53382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204711.53440: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204711.53445: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204711.53483: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204711.55171: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 43681 1727204711.55208: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 43681 1727204711.55249: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-436817kzep24q/tmpvk3ovmzl /root/.ansible/tmp/ansible-tmp-1727204711.497544-44346-82105790158106/AnsiballZ_command.py <<< 43681 1727204711.55253: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204711.497544-44346-82105790158106/AnsiballZ_command.py" <<< 43681 1727204711.55282: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-436817kzep24q/tmpvk3ovmzl" to remote "/root/.ansible/tmp/ansible-tmp-1727204711.497544-44346-82105790158106/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204711.497544-44346-82105790158106/AnsiballZ_command.py" <<< 43681 1727204711.56050: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204711.56122: stderr chunk (state=3): >>><<< 43681 1727204711.56126: stdout chunk (state=3): >>><<< 43681 1727204711.56146: done transferring module to remote 43681 1727204711.56157: _low_level_execute_command(): starting 43681 1727204711.56162: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204711.497544-44346-82105790158106/ /root/.ansible/tmp/ansible-tmp-1727204711.497544-44346-82105790158106/AnsiballZ_command.py && sleep 0' 43681 1727204711.56642: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204711.56645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 43681 1727204711.56648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204711.56652: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204711.56654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204711.56709: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204711.56717: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204711.56756: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204711.58896: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204711.58900: stdout chunk (state=3): >>><<< 43681 1727204711.58903: stderr chunk (state=3): >>><<< 43681 1727204711.58906: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204711.58909: _low_level_execute_command(): starting 43681 1727204711.58912: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204711.497544-44346-82105790158106/AnsiballZ_command.py && sleep 0' 43681 1727204711.59537: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204711.59541: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204711.59552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204711.59594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204711.59609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204711.59692: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204711.59722: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 43681 1727204711.59840: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204711.77994: stdout chunk (state=3): >>> {"changed": true, "stdout": "30600:\tfrom all to 2001:db8::4/32 lookup 30600 proto static\n30601:\tnot from all dport 128-256 lookup 30600 proto static\n30602:\tfrom all lookup 30600 proto static", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "rule", "list", "table", "30600"], "start": "2024-09-24 15:05:11.775009", "end": "2024-09-24 15:05:11.778710", "delta": "0:00:00.003701", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 rule list table 30600", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 43681 1727204711.79819: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 43681 1727204711.79837: stderr chunk (state=3): >>><<< 43681 1727204711.79847: stdout chunk (state=3): >>><<< 43681 1727204711.79879: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "30600:\tfrom all to 2001:db8::4/32 lookup 30600 proto static\n30601:\tnot from all dport 128-256 lookup 30600 proto static\n30602:\tfrom all lookup 30600 proto static", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "rule", "list", "table", "30600"], "start": "2024-09-24 15:05:11.775009", "end": "2024-09-24 15:05:11.778710", "delta": "0:00:00.003701", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 rule list table 30600", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
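This run corresponds to the task at tests_routing_rules.yml:129 and follows the same pattern as the previous lookup, except that it queries the IPv6 policy rules with ip -6. A sketch under the same assumptions (task shape inferred from the log, register name invented):

    - name: Get the routing rule for looking up the table 30600
      command: ip -6 rule list table 30600
      register: route_rule_table_30600   # variable name assumed
      changed_when: false                # inferred from the changed=false task result

Note the inverted rule in the output ("30601: not from all dport 128-256 ..."), which matches on a destination-port range and appears only in this IPv6 listing.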
43681 1727204711.79944: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 rule list table 30600', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204711.497544-44346-82105790158106/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 43681 1727204711.79963: _low_level_execute_command(): starting 43681 1727204711.79975: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204711.497544-44346-82105790158106/ > /dev/null 2>&1 && sleep 0' 43681 1727204711.80647: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204711.80666: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204711.80684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204711.80708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204711.80732: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 43681 1727204711.80753: stderr chunk (state=3): >>>debug2: match not found <<< 43681 1727204711.80770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204711.80809: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204711.80902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204711.80933: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204711.81006: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204711.83014: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204711.83045: stderr chunk (state=3): >>><<< 43681 1727204711.83060: stdout chunk (state=3): >>><<< 43681 1727204711.83082: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204711.83098: handler run complete 43681 1727204711.83150: Evaluated conditional (False): False 43681 1727204711.83170: attempt loop complete, returning result 43681 1727204711.83178: _execute() done 43681 1727204711.83186: dumping result to json 43681 1727204711.83198: done dumping result, returning 43681 1727204711.83211: done running TaskExecutor() for managed-node3/TASK: Get the routing rule for looking up the table 30600 [12b410aa-8751-9e86-7728-00000000005e] 43681 1727204711.83225: sending task result for task 12b410aa-8751-9e86-7728-00000000005e ok: [managed-node3] => { "changed": false, "cmd": [ "ip", "-6", "rule", "list", "table", "30600" ], "delta": "0:00:00.003701", "end": "2024-09-24 15:05:11.778710", "rc": 0, "start": "2024-09-24 15:05:11.775009" } STDOUT: 30600: from all to 2001:db8::4/32 lookup 30600 proto static 30601: not from all dport 128-256 lookup 30600 proto static 30602: from all lookup 30600 proto static 43681 1727204711.83568: no more pending results, returning what we have 43681 1727204711.83572: results queue empty 43681 1727204711.83574: checking for any_errors_fatal 43681 1727204711.83586: done checking for any_errors_fatal 43681 1727204711.83587: checking for max_fail_percentage 43681 1727204711.83592: done checking for max_fail_percentage 43681 1727204711.83593: checking to see if all hosts have failed and the running result is not ok 43681 1727204711.83594: done checking to see if all hosts have failed 43681 1727204711.83595: getting the remaining hosts for this loop 43681 1727204711.83596: done getting the remaining hosts for this loop 43681 1727204711.83602: getting the next task for host managed-node3 43681 1727204711.83608: done getting next task for host managed-node3 43681 1727204711.83612: ^ task is: TASK: Get the routing rule for looking up the table 'custom' 43681 1727204711.83614: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204711.83621: getting variables 43681 1727204711.83623: in VariableManager get_vars() 43681 1727204711.83666: Calling all_inventory to load vars for managed-node3 43681 1727204711.83669: Calling groups_inventory to load vars for managed-node3 43681 1727204711.83672: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204711.83685: Calling all_plugins_play to load vars for managed-node3 43681 1727204711.83904: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204711.83912: done sending task result for task 12b410aa-8751-9e86-7728-00000000005e 43681 1727204711.83918: WORKER PROCESS EXITING 43681 1727204711.83924: Calling groups_plugins_play to load vars for managed-node3 43681 1727204711.86362: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204711.89498: done with get_vars() 43681 1727204711.89550: done getting variables 43681 1727204711.89630: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get the routing rule for looking up the table 'custom'] ****************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:136 Tuesday 24 September 2024 15:05:11 -0400 (0:00:00.446) 0:00:19.563 ***** 43681 1727204711.89669: entering _queue_task() for managed-node3/command 43681 1727204711.90075: worker is 1 (out of 1 available) 43681 1727204711.90194: exiting _queue_task() for managed-node3/command 43681 1727204711.90212: done queuing things up, now waiting for results queue to drain 43681 1727204711.90215: waiting for pending results... 
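The records above close out the table 30600 lookup (command: ip -6 rule list table 30600, whose result with three matching rules is shown just before) and queue the next lookup against the table named 'custom' from tests_routing_rules.yml:136. The playbook source itself is not part of this trace, so the following is only a rough reconstruction of what those two tasks presumably look like: the task names and commands are taken from the log, the register names are hypothetical, and changed_when: false is inferred from the module invocations later in the trace returning "changed": true while the corresponding task results report "changed": false.

    - name: Get the routing rule for looking up the table 30600
      command: ip -6 rule list table 30600
      register: route_rule_table_30600    # hypothetical variable name
      changed_when: false                 # inferred: module returns changed=true, task result shows changed=false

    - name: Get the routing rule for looking up the table 'custom'
      command: ip rule list table custom
      register: route_rule_table_custom   # hypothetical variable name
      changed_when: false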
43681 1727204711.90459: running TaskExecutor() for managed-node3/TASK: Get the routing rule for looking up the table 'custom' 43681 1727204711.90575: in run() - task 12b410aa-8751-9e86-7728-00000000005f 43681 1727204711.90600: variable 'ansible_search_path' from source: unknown 43681 1727204711.90653: calling self._execute() 43681 1727204711.90775: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204711.90792: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204711.90818: variable 'omit' from source: magic vars 43681 1727204711.91313: variable 'ansible_distribution_major_version' from source: facts 43681 1727204711.91495: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204711.91499: variable 'ansible_distribution_major_version' from source: facts 43681 1727204711.91518: Evaluated conditional (ansible_distribution_major_version != "7"): True 43681 1727204711.91531: variable 'omit' from source: magic vars 43681 1727204711.91559: variable 'omit' from source: magic vars 43681 1727204711.91620: variable 'omit' from source: magic vars 43681 1727204711.91672: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204711.91730: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204711.91757: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204711.91783: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204711.91803: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204711.91852: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204711.91861: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204711.91870: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204711.92008: Set connection var ansible_shell_type to sh 43681 1727204711.92026: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204711.92053: Set connection var ansible_timeout to 10 43681 1727204711.92060: Set connection var ansible_pipelining to False 43681 1727204711.92163: Set connection var ansible_connection to ssh 43681 1727204711.92166: Set connection var ansible_shell_executable to /bin/sh 43681 1727204711.92169: variable 'ansible_shell_executable' from source: unknown 43681 1727204711.92171: variable 'ansible_connection' from source: unknown 43681 1727204711.92174: variable 'ansible_module_compression' from source: unknown 43681 1727204711.92176: variable 'ansible_shell_type' from source: unknown 43681 1727204711.92178: variable 'ansible_shell_executable' from source: unknown 43681 1727204711.92180: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204711.92182: variable 'ansible_pipelining' from source: unknown 43681 1727204711.92184: variable 'ansible_timeout' from source: unknown 43681 1727204711.92186: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204711.92345: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 43681 1727204711.92367: variable 'omit' from source: magic vars 43681 1727204711.92385: starting attempt loop 43681 1727204711.92395: running the handler 43681 1727204711.92419: _low_level_execute_command(): starting 43681 1727204711.92432: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 43681 1727204711.93122: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204711.93144: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204711.93152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204711.93169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204711.93184: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 43681 1727204711.93337: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 43681 1727204711.93344: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204711.93347: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204711.93383: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204711.95147: stdout chunk (state=3): >>>/root <<< 43681 1727204711.95339: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204711.95420: stderr chunk (state=3): >>><<< 43681 1727204711.95449: stdout chunk (state=3): >>><<< 43681 1727204711.95472: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status 
from master 0 43681 1727204711.95590: _low_level_execute_command(): starting 43681 1727204711.95595: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204711.9547975-44365-246724193600627 `" && echo ansible-tmp-1727204711.9547975-44365-246724193600627="` echo /root/.ansible/tmp/ansible-tmp-1727204711.9547975-44365-246724193600627 `" ) && sleep 0' 43681 1727204711.96139: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204711.96156: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204711.96175: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204711.96199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204711.96228: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 43681 1727204711.96314: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204711.96352: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204711.96377: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204711.96396: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204711.96472: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204711.98569: stdout chunk (state=3): >>>ansible-tmp-1727204711.9547975-44365-246724193600627=/root/.ansible/tmp/ansible-tmp-1727204711.9547975-44365-246724193600627 <<< 43681 1727204711.98704: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204711.98797: stderr chunk (state=3): >>><<< 43681 1727204711.98800: stdout chunk (state=3): >>><<< 43681 1727204711.98828: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204711.9547975-44365-246724193600627=/root/.ansible/tmp/ansible-tmp-1727204711.9547975-44365-246724193600627 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204711.98994: variable 'ansible_module_compression' from source: unknown 43681 1727204711.98998: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-436817kzep24q/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 43681 1727204711.99000: variable 'ansible_facts' from source: unknown 43681 1727204711.99070: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204711.9547975-44365-246724193600627/AnsiballZ_command.py 43681 1727204711.99245: Sending initial data 43681 1727204711.99344: Sent initial data (156 bytes) 43681 1727204711.99921: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204711.99937: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204712.00005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204712.00071: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204712.00091: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204712.00122: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204712.00191: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204712.01942: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 43681 1727204712.01946: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 43681 1727204712.01964: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 43681 1727204712.02025: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-436817kzep24q/tmpde2bnu44 /root/.ansible/tmp/ansible-tmp-1727204711.9547975-44365-246724193600627/AnsiballZ_command.py <<< 43681 1727204712.02315: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204711.9547975-44365-246724193600627/AnsiballZ_command.py" <<< 43681 1727204712.02319: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-436817kzep24q/tmpde2bnu44" to remote "/root/.ansible/tmp/ansible-tmp-1727204711.9547975-44365-246724193600627/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204711.9547975-44365-246724193600627/AnsiballZ_command.py" <<< 43681 1727204712.03164: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204712.03283: stderr chunk (state=3): >>><<< 43681 1727204712.03294: stdout chunk (state=3): >>><<< 43681 1727204712.03331: done transferring module to remote 43681 1727204712.03345: _low_level_execute_command(): starting 43681 1727204712.03353: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204711.9547975-44365-246724193600627/ /root/.ansible/tmp/ansible-tmp-1727204711.9547975-44365-246724193600627/AnsiballZ_command.py && sleep 0' 43681 1727204712.04043: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204712.04054: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204712.04075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204712.04115: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204712.04183: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204712.04213: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204712.04281: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204712.04285: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204712.04315: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204712.06210: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204712.06288: stderr chunk (state=3): >>><<< 43681 1727204712.06295: stdout chunk (state=3): >>><<< 43681 1727204712.06324: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204712.06327: _low_level_execute_command(): starting 43681 1727204712.06496: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204711.9547975-44365-246724193600627/AnsiballZ_command.py && sleep 0' 43681 1727204712.07009: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204712.07022: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204712.07035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204712.07057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204712.07071: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 43681 1727204712.07079: stderr chunk (state=3): >>>debug2: match not found <<< 43681 1727204712.07104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 43681 1727204712.07171: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204712.07214: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204712.07237: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204712.07247: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204712.07325: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204712.24967: stdout chunk (state=3): >>> {"changed": true, "stdout": "200:\tfrom 198.51.100.56/26 lookup custom proto static", "stderr": "", "rc": 0, "cmd": ["ip", "rule", "list", "table", "custom"], "start": "2024-09-24 15:05:12.244787", "end": "2024-09-24 15:05:12.248403", "delta": "0:00:00.003616", "msg": "", "invocation": {"module_args": {"_raw_params": "ip rule list table custom", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 43681 1727204712.26752: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 43681 1727204712.26756: stdout chunk (state=3): >>><<< 43681 1727204712.26759: stderr chunk (state=3): >>><<< 43681 1727204712.26781: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "200:\tfrom 198.51.100.56/26 lookup custom proto static", "stderr": "", "rc": 0, "cmd": ["ip", "rule", "list", "table", "custom"], "start": "2024-09-24 15:05:12.244787", "end": "2024-09-24 15:05:12.248403", "delta": "0:00:00.003616", "msg": "", "invocation": {"module_args": {"_raw_params": "ip rule list table custom", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
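The JSON blob above is the raw return from AnsiballZ_command.py for ip rule list table custom: a single rule, priority 200, selecting table 'custom' for traffic from 198.51.100.56/26. A test that registers this output would typically follow up with an assertion along the following lines; this check is purely illustrative, it does not appear anywhere in this trace, and route_rule_table_custom is the hypothetical register name used in the sketch above.

    - name: Assert that the rule for table 'custom' is present    # illustrative only
      assert:
        that:
          - "'198.51.100.56/26' in route_rule_table_custom.stdout"
          - "'lookup custom' in route_rule_table_custom.stdout"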
43681 1727204712.26897: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip rule list table custom', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204711.9547975-44365-246724193600627/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 43681 1727204712.26902: _low_level_execute_command(): starting 43681 1727204712.26906: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204711.9547975-44365-246724193600627/ > /dev/null 2>&1 && sleep 0' 43681 1727204712.27564: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204712.27585: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204712.27606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204712.27630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204712.27649: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 43681 1727204712.27668: stderr chunk (state=3): >>>debug2: match not found <<< 43681 1727204712.27711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 43681 1727204712.27728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204712.27813: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204712.27840: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204712.27915: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204712.29918: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204712.29923: stdout chunk (state=3): >>><<< 43681 1727204712.29937: stderr chunk (state=3): >>><<< 43681 1727204712.29963: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204712.29972: handler run complete 43681 1727204712.30006: Evaluated conditional (False): False 43681 1727204712.30021: attempt loop complete, returning result 43681 1727204712.30024: _execute() done 43681 1727204712.30030: dumping result to json 43681 1727204712.30045: done dumping result, returning 43681 1727204712.30056: done running TaskExecutor() for managed-node3/TASK: Get the routing rule for looking up the table 'custom' [12b410aa-8751-9e86-7728-00000000005f] 43681 1727204712.30063: sending task result for task 12b410aa-8751-9e86-7728-00000000005f 43681 1727204712.30193: done sending task result for task 12b410aa-8751-9e86-7728-00000000005f 43681 1727204712.30196: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": [ "ip", "rule", "list", "table", "custom" ], "delta": "0:00:00.003616", "end": "2024-09-24 15:05:12.248403", "rc": 0, "start": "2024-09-24 15:05:12.244787" } STDOUT: 200: from 198.51.100.56/26 lookup custom proto static 43681 1727204712.30302: no more pending results, returning what we have 43681 1727204712.30307: results queue empty 43681 1727204712.30308: checking for any_errors_fatal 43681 1727204712.30319: done checking for any_errors_fatal 43681 1727204712.30320: checking for max_fail_percentage 43681 1727204712.30323: done checking for max_fail_percentage 43681 1727204712.30324: checking to see if all hosts have failed and the running result is not ok 43681 1727204712.30325: done checking to see if all hosts have failed 43681 1727204712.30326: getting the remaining hosts for this loop 43681 1727204712.30328: done getting the remaining hosts for this loop 43681 1727204712.30333: getting the next task for host managed-node3 43681 1727204712.30340: done getting next task for host managed-node3 43681 1727204712.30343: ^ task is: TASK: Get the IPv4 routing rule for the connection "{{ interface }}" 43681 1727204712.30345: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204712.30350: getting variables 43681 1727204712.30352: in VariableManager get_vars() 43681 1727204712.30608: Calling all_inventory to load vars for managed-node3 43681 1727204712.30612: Calling groups_inventory to load vars for managed-node3 43681 1727204712.30615: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204712.30627: Calling all_plugins_play to load vars for managed-node3 43681 1727204712.30631: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204712.30634: Calling groups_plugins_play to load vars for managed-node3 43681 1727204712.33167: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204712.36285: done with get_vars() 43681 1727204712.36330: done getting variables 43681 1727204712.36410: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 43681 1727204712.36560: variable 'interface' from source: set_fact TASK [Get the IPv4 routing rule for the connection "ethtest0"] ***************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:143 Tuesday 24 September 2024 15:05:12 -0400 (0:00:00.469) 0:00:20.032 ***** 43681 1727204712.36595: entering _queue_task() for managed-node3/command 43681 1727204712.37080: worker is 1 (out of 1 available) 43681 1727204712.37095: exiting _queue_task() for managed-node3/command 43681 1727204712.37107: done queuing things up, now waiting for results queue to drain 43681 1727204712.37109: waiting for pending results... 
43681 1727204712.37509: running TaskExecutor() for managed-node3/TASK: Get the IPv4 routing rule for the connection "ethtest0" 43681 1727204712.37519: in run() - task 12b410aa-8751-9e86-7728-000000000060 43681 1727204712.37523: variable 'ansible_search_path' from source: unknown 43681 1727204712.37527: calling self._execute() 43681 1727204712.37624: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204712.37628: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204712.37641: variable 'omit' from source: magic vars 43681 1727204712.38104: variable 'ansible_distribution_major_version' from source: facts 43681 1727204712.38120: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204712.38130: variable 'omit' from source: magic vars 43681 1727204712.38158: variable 'omit' from source: magic vars 43681 1727204712.38293: variable 'interface' from source: set_fact 43681 1727204712.38394: variable 'omit' from source: magic vars 43681 1727204712.38398: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204712.38420: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204712.38442: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204712.38471: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204712.38484: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204712.38522: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204712.38526: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204712.38531: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204712.38658: Set connection var ansible_shell_type to sh 43681 1727204712.38674: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204712.38682: Set connection var ansible_timeout to 10 43681 1727204712.38695: Set connection var ansible_pipelining to False 43681 1727204712.38894: Set connection var ansible_connection to ssh 43681 1727204712.38898: Set connection var ansible_shell_executable to /bin/sh 43681 1727204712.38901: variable 'ansible_shell_executable' from source: unknown 43681 1727204712.38904: variable 'ansible_connection' from source: unknown 43681 1727204712.38907: variable 'ansible_module_compression' from source: unknown 43681 1727204712.38909: variable 'ansible_shell_type' from source: unknown 43681 1727204712.38912: variable 'ansible_shell_executable' from source: unknown 43681 1727204712.38914: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204712.38919: variable 'ansible_pipelining' from source: unknown 43681 1727204712.38922: variable 'ansible_timeout' from source: unknown 43681 1727204712.38924: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204712.38940: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 43681 1727204712.38955: variable 'omit' from source: 
magic vars 43681 1727204712.38962: starting attempt loop 43681 1727204712.38965: running the handler 43681 1727204712.38984: _low_level_execute_command(): starting 43681 1727204712.38999: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 43681 1727204712.39786: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 43681 1727204712.39815: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204712.39839: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204712.39914: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204712.41647: stdout chunk (state=3): >>>/root <<< 43681 1727204712.41847: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204712.41851: stdout chunk (state=3): >>><<< 43681 1727204712.41854: stderr chunk (state=3): >>><<< 43681 1727204712.41876: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204712.41900: _low_level_execute_command(): starting 43681 1727204712.41995: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204712.418832-44378-234034991863112 `" && echo ansible-tmp-1727204712.418832-44378-234034991863112="` echo /root/.ansible/tmp/ansible-tmp-1727204712.418832-44378-234034991863112 `" ) && sleep 0' 43681 1727204712.42561: stderr chunk (state=2): 
>>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204712.42578: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204712.42598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204712.42623: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204712.42643: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 43681 1727204712.42732: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204712.42744: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204712.42755: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204712.42793: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204712.42878: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204712.44839: stdout chunk (state=3): >>>ansible-tmp-1727204712.418832-44378-234034991863112=/root/.ansible/tmp/ansible-tmp-1727204712.418832-44378-234034991863112 <<< 43681 1727204712.44967: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204712.45019: stderr chunk (state=3): >>><<< 43681 1727204712.45022: stdout chunk (state=3): >>><<< 43681 1727204712.45058: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204712.418832-44378-234034991863112=/root/.ansible/tmp/ansible-tmp-1727204712.418832-44378-234034991863112 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204712.45070: variable 'ansible_module_compression' from source: unknown 43681 1727204712.45128: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-436817kzep24q/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 
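Every task in this trace repeats the same remote round trips: mkdir of a per-task tmp dir, an sftp put of AnsiballZ_command.py, chmod, a python3.12 execution, and finally rm of the tmp dir. That is the non-pipelined code path; the "Set connection var ansible_pipelining to False" records above confirm pipelining is off for these runs. Purely as an illustration (not part of the traced test), setting the connection variable the log names, for example in inventory or group_vars, would let most modules be piped over a single SSH exec instead of being copied to disk first:

    # hypothetical group_vars entry, not part of the traced test
    ansible_pipelining: true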
43681 1727204712.45164: variable 'ansible_facts' from source: unknown 43681 1727204712.45429: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204712.418832-44378-234034991863112/AnsiballZ_command.py 43681 1727204712.45464: Sending initial data 43681 1727204712.45475: Sent initial data (155 bytes) 43681 1727204712.46060: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204712.46133: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204712.46151: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204712.46208: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204712.46249: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204712.47863: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 43681 1727204712.47896: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 43681 1727204712.47930: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-436817kzep24q/tmphdxr8ejs /root/.ansible/tmp/ansible-tmp-1727204712.418832-44378-234034991863112/AnsiballZ_command.py <<< 43681 1727204712.47937: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204712.418832-44378-234034991863112/AnsiballZ_command.py" <<< 43681 1727204712.47966: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-436817kzep24q/tmphdxr8ejs" to remote "/root/.ansible/tmp/ansible-tmp-1727204712.418832-44378-234034991863112/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204712.418832-44378-234034991863112/AnsiballZ_command.py" <<< 43681 1727204712.48915: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204712.48919: stderr chunk (state=3): >>><<< 43681 1727204712.48935: stdout chunk (state=3): >>><<< 43681 1727204712.48952: done transferring module to remote 43681 1727204712.48975: _low_level_execute_command(): starting 43681 1727204712.48979: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204712.418832-44378-234034991863112/ /root/.ansible/tmp/ansible-tmp-1727204712.418832-44378-234034991863112/AnsiballZ_command.py && sleep 0' 43681 1727204712.49592: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204712.49728: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204712.49732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204712.49735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204712.49737: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204712.49762: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204712.49766: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204712.49836: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204712.51702: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204712.51999: stderr chunk (state=3): >>><<< 43681 1727204712.52002: stdout chunk (state=3): >>><<< 43681 1727204712.52010: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204712.52012: _low_level_execute_command(): starting 43681 1727204712.52015: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204712.418832-44378-234034991863112/AnsiballZ_command.py && sleep 0' 43681 1727204712.52364: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204712.52374: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204712.52386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204712.52405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204712.52417: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 43681 1727204712.52433: stderr chunk (state=3): >>>debug2: match not found <<< 43681 1727204712.52440: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204712.52459: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 43681 1727204712.52462: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address <<< 43681 1727204712.52471: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 43681 1727204712.52480: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204712.52494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204712.52512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204712.52524: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 43681 1727204712.52532: stderr chunk (state=3): >>>debug2: match found <<< 43681 1727204712.52543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204712.52627: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204712.52640: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204712.52662: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204712.52728: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204712.71785: stdout chunk (state=3): >>> {"changed": true, "stdout": "ipv4.routing-rules: priority 30200 from 198.51.100.58/26 table 30200, priority 30201 from 0.0.0.0/0 fwmark 0x1/0x1 table 30200, priority 30202 from 0.0.0.0/0 ipproto 6 table 30200, priority 30203 from 
0.0.0.0/0 sport 128-256 table 30200, priority 30204 from 0.0.0.0/0 tos 0x08 table 30200, priority 30400 to 198.51.100.128/26 table 30400, priority 30401 from 0.0.0.0/0 iif iiftest table 30400, priority 30402 from 0.0.0.0/0 oif oiftest table 30400, priority 30403 from 0.0.0.0/0 table 30400, priority 200 from 198.51.100.56/26 table 200", "stderr": "", "rc": 0, "cmd": ["nmcli", "-f", "ipv4.routing-rules", "c", "show", "ethtest0"], "start": "2024-09-24 15:05:12.698805", "end": "2024-09-24 15:05:12.716569", "delta": "0:00:00.017764", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f ipv4.routing-rules c show \"ethtest0\"", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 43681 1727204712.73479: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 43681 1727204712.73544: stderr chunk (state=3): >>><<< 43681 1727204712.73549: stdout chunk (state=3): >>><<< 43681 1727204712.73569: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "ipv4.routing-rules: priority 30200 from 198.51.100.58/26 table 30200, priority 30201 from 0.0.0.0/0 fwmark 0x1/0x1 table 30200, priority 30202 from 0.0.0.0/0 ipproto 6 table 30200, priority 30203 from 0.0.0.0/0 sport 128-256 table 30200, priority 30204 from 0.0.0.0/0 tos 0x08 table 30200, priority 30400 to 198.51.100.128/26 table 30400, priority 30401 from 0.0.0.0/0 iif iiftest table 30400, priority 30402 from 0.0.0.0/0 oif oiftest table 30400, priority 30403 from 0.0.0.0/0 table 30400, priority 200 from 198.51.100.56/26 table 200", "stderr": "", "rc": 0, "cmd": ["nmcli", "-f", "ipv4.routing-rules", "c", "show", "ethtest0"], "start": "2024-09-24 15:05:12.698805", "end": "2024-09-24 15:05:12.716569", "delta": "0:00:00.017764", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f ipv4.routing-rules c show \"ethtest0\"", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
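The return above belongs to the task at tests_routing_rules.yml:143. With interface resolved to ethtest0 via set_fact (see the "variable 'interface' from source: set_fact" records), the task that produced it presumably looks roughly like the sketch below; the task name and command are as logged, the register name is hypothetical, and changed_when: false is again inferred from the module reporting "changed": true while the task result that follows reports "changed": false.

    - name: Get the IPv4 routing rule for the connection "{{ interface }}"
      command: nmcli -f ipv4.routing-rules c show "{{ interface }}"
      register: connection_route_rule    # hypothetical variable name
      changed_when: false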
43681 1727204712.73629: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f ipv4.routing-rules c show "ethtest0"', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204712.418832-44378-234034991863112/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 43681 1727204712.73637: _low_level_execute_command(): starting 43681 1727204712.73642: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204712.418832-44378-234034991863112/ > /dev/null 2>&1 && sleep 0' 43681 1727204712.74133: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204712.74136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204712.74139: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204712.74142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204712.74193: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204712.74200: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204712.74244: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204712.76137: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204712.76181: stderr chunk (state=3): >>><<< 43681 1727204712.76185: stdout chunk (state=3): >>><<< 43681 1727204712.76201: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204712.76209: handler run complete 43681 1727204712.76237: Evaluated conditional (False): False 43681 1727204712.76248: attempt loop complete, returning result 43681 1727204712.76253: _execute() done 43681 1727204712.76255: dumping result to json 43681 1727204712.76262: done dumping result, returning 43681 1727204712.76271: done running TaskExecutor() for managed-node3/TASK: Get the IPv4 routing rule for the connection "ethtest0" [12b410aa-8751-9e86-7728-000000000060] 43681 1727204712.76277: sending task result for task 12b410aa-8751-9e86-7728-000000000060 43681 1727204712.76381: done sending task result for task 12b410aa-8751-9e86-7728-000000000060 43681 1727204712.76384: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": [ "nmcli", "-f", "ipv4.routing-rules", "c", "show", "ethtest0" ], "delta": "0:00:00.017764", "end": "2024-09-24 15:05:12.716569", "rc": 0, "start": "2024-09-24 15:05:12.698805" } STDOUT: ipv4.routing-rules: priority 30200 from 198.51.100.58/26 table 30200, priority 30201 from 0.0.0.0/0 fwmark 0x1/0x1 table 30200, priority 30202 from 0.0.0.0/0 ipproto 6 table 30200, priority 30203 from 0.0.0.0/0 sport 128-256 table 30200, priority 30204 from 0.0.0.0/0 tos 0x08 table 30200, priority 30400 to 198.51.100.128/26 table 30400, priority 30401 from 0.0.0.0/0 iif iiftest table 30400, priority 30402 from 0.0.0.0/0 oif oiftest table 30400, priority 30403 from 0.0.0.0/0 table 30400, priority 200 from 198.51.100.56/26 table 200 43681 1727204712.76474: no more pending results, returning what we have 43681 1727204712.76478: results queue empty 43681 1727204712.76479: checking for any_errors_fatal 43681 1727204712.76498: done checking for any_errors_fatal 43681 1727204712.76499: checking for max_fail_percentage 43681 1727204712.76502: done checking for max_fail_percentage 43681 1727204712.76502: checking to see if all hosts have failed and the running result is not ok 43681 1727204712.76503: done checking to see if all hosts have failed 43681 1727204712.76504: getting the remaining hosts for this loop 43681 1727204712.76506: done getting the remaining hosts for this loop 43681 1727204712.76510: getting the next task for host managed-node3 43681 1727204712.76516: done getting next task for host managed-node3 43681 1727204712.76520: ^ task is: TASK: Get the IPv6 routing rule for the connection "{{ interface }}" 43681 1727204712.76522: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204712.76527: getting variables 43681 1727204712.76528: in VariableManager get_vars() 43681 1727204712.76567: Calling all_inventory to load vars for managed-node3 43681 1727204712.76570: Calling groups_inventory to load vars for managed-node3 43681 1727204712.76573: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204712.76584: Calling all_plugins_play to load vars for managed-node3 43681 1727204712.76588: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204712.76594: Calling groups_plugins_play to load vars for managed-node3 43681 1727204712.77999: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204712.81033: done with get_vars() 43681 1727204712.81069: done getting variables 43681 1727204712.81141: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 43681 1727204712.81280: variable 'interface' from source: set_fact TASK [Get the IPv6 routing rule for the connection "ethtest0"] ***************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:149 Tuesday 24 September 2024 15:05:12 -0400 (0:00:00.447) 0:00:20.479 ***** 43681 1727204712.81316: entering _queue_task() for managed-node3/command 43681 1727204712.81671: worker is 1 (out of 1 available) 43681 1727204712.81686: exiting _queue_task() for managed-node3/command 43681 1727204712.81701: done queuing things up, now waiting for results queue to drain 43681 1727204712.81702: waiting for pending results... 
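Before the module runs, the worker applies the per-host connection settings visible in the next entries: shell type sh, ZIP_DEFLATED module compression, a 10 second timeout, pipelining disabled, the ssh connection plugin, and /bin/sh as the shell executable. A sketch of host_vars that would pin the same values explicitly, assuming one wanted them recorded in inventory rather than left to defaults:

    ansible_connection: ssh
    ansible_shell_type: sh
    ansible_shell_executable: /bin/sh
    ansible_timeout: 10
    ansible_module_compression: ZIP_DEFLATED
    ansible_pipelining: false   # with pipelining off, each module is shipped to a remote tmp dir over SFTP, as the transfer below shows

These variable names are standard Ansible connection variables; the values are the ones reported in this run.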
43681 1727204712.82113: running TaskExecutor() for managed-node3/TASK: Get the IPv6 routing rule for the connection "ethtest0" 43681 1727204712.82131: in run() - task 12b410aa-8751-9e86-7728-000000000061 43681 1727204712.82152: variable 'ansible_search_path' from source: unknown 43681 1727204712.82204: calling self._execute() 43681 1727204712.82318: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204712.82336: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204712.82354: variable 'omit' from source: magic vars 43681 1727204712.82794: variable 'ansible_distribution_major_version' from source: facts 43681 1727204712.82814: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204712.82828: variable 'omit' from source: magic vars 43681 1727204712.82869: variable 'omit' from source: magic vars 43681 1727204712.82993: variable 'interface' from source: set_fact 43681 1727204712.83087: variable 'omit' from source: magic vars 43681 1727204712.83092: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204712.83122: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204712.83154: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204712.83180: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204712.83202: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204712.83241: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204712.83250: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204712.83259: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204712.83384: Set connection var ansible_shell_type to sh 43681 1727204712.83400: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204712.83417: Set connection var ansible_timeout to 10 43681 1727204712.83434: Set connection var ansible_pipelining to False 43681 1727204712.83448: Set connection var ansible_connection to ssh 43681 1727204712.83523: Set connection var ansible_shell_executable to /bin/sh 43681 1727204712.83526: variable 'ansible_shell_executable' from source: unknown 43681 1727204712.83529: variable 'ansible_connection' from source: unknown 43681 1727204712.83532: variable 'ansible_module_compression' from source: unknown 43681 1727204712.83535: variable 'ansible_shell_type' from source: unknown 43681 1727204712.83537: variable 'ansible_shell_executable' from source: unknown 43681 1727204712.83539: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204712.83541: variable 'ansible_pipelining' from source: unknown 43681 1727204712.83543: variable 'ansible_timeout' from source: unknown 43681 1727204712.83545: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204712.83708: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 43681 1727204712.83727: variable 'omit' from source: 
magic vars 43681 1727204712.83743: starting attempt loop 43681 1727204712.83750: running the handler 43681 1727204712.83771: _low_level_execute_command(): starting 43681 1727204712.83783: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 43681 1727204712.84618: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204712.84664: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204712.84699: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204712.84726: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204712.84793: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204712.86508: stdout chunk (state=3): >>>/root <<< 43681 1727204712.86726: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204712.86730: stdout chunk (state=3): >>><<< 43681 1727204712.86733: stderr chunk (state=3): >>><<< 43681 1727204712.86760: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204712.86874: _low_level_execute_command(): starting 43681 1727204712.86878: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204712.8676696-44401-206413208538079 `" && echo ansible-tmp-1727204712.8676696-44401-206413208538079="` echo /root/.ansible/tmp/ansible-tmp-1727204712.8676696-44401-206413208538079 `" ) && sleep 
0' 43681 1727204712.87454: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204712.87465: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204712.87478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204712.87498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204712.87513: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 43681 1727204712.87559: stderr chunk (state=3): >>>debug2: match not found <<< 43681 1727204712.87563: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204712.87573: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 43681 1727204712.87576: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address <<< 43681 1727204712.87655: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204712.87738: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204712.87742: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204712.89744: stdout chunk (state=3): >>>ansible-tmp-1727204712.8676696-44401-206413208538079=/root/.ansible/tmp/ansible-tmp-1727204712.8676696-44401-206413208538079 <<< 43681 1727204712.89913: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204712.89960: stderr chunk (state=3): >>><<< 43681 1727204712.89972: stdout chunk (state=3): >>><<< 43681 1727204712.90000: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204712.8676696-44401-206413208538079=/root/.ansible/tmp/ansible-tmp-1727204712.8676696-44401-206413208538079 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204712.90164: variable 'ansible_module_compression' from source: 
unknown 43681 1727204712.90167: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-436817kzep24q/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 43681 1727204712.90170: variable 'ansible_facts' from source: unknown 43681 1727204712.90260: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204712.8676696-44401-206413208538079/AnsiballZ_command.py 43681 1727204712.90532: Sending initial data 43681 1727204712.90542: Sent initial data (156 bytes) 43681 1727204712.91119: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204712.91209: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204712.91262: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204712.91286: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204712.91309: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204712.91377: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204712.92968: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 43681 1727204712.93071: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 43681 1727204712.93078: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-436817kzep24q/tmpxumhxjv7 /root/.ansible/tmp/ansible-tmp-1727204712.8676696-44401-206413208538079/AnsiballZ_command.py <<< 43681 1727204712.93082: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204712.8676696-44401-206413208538079/AnsiballZ_command.py" <<< 43681 1727204712.93133: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-436817kzep24q/tmpxumhxjv7" to remote "/root/.ansible/tmp/ansible-tmp-1727204712.8676696-44401-206413208538079/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204712.8676696-44401-206413208538079/AnsiballZ_command.py" <<< 43681 1727204712.94367: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204712.94624: stderr chunk (state=3): >>><<< 43681 1727204712.94627: stdout chunk (state=3): >>><<< 43681 1727204712.94630: done transferring module to remote 43681 1727204712.94632: _low_level_execute_command(): starting 43681 1727204712.94635: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204712.8676696-44401-206413208538079/ /root/.ansible/tmp/ansible-tmp-1727204712.8676696-44401-206413208538079/AnsiballZ_command.py && sleep 0' 43681 1727204712.95304: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204712.95318: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204712.95433: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204712.95453: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204712.95521: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204712.97336: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204712.97431: stderr chunk (state=3): >>><<< 43681 1727204712.97441: stdout chunk (state=3): >>><<< 43681 1727204712.97470: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204712.97481: _low_level_execute_command(): starting 43681 1727204712.97494: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204712.8676696-44401-206413208538079/AnsiballZ_command.py && sleep 0' 43681 1727204712.98145: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204712.98161: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204712.98174: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204712.98197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204712.98220: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 43681 1727204712.98232: stderr chunk (state=3): >>>debug2: match not found <<< 43681 1727204712.98258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204712.98307: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204712.98375: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204712.98396: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204712.98420: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204712.98502: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204713.17530: stdout chunk (state=3): >>> {"changed": true, "stdout": "ipv6.routing-rules: priority 30600 to 2001:db8::4/32 table 30600, priority 30601 not from ::/0 dport 128-256 table 30600, priority 30602 from ::/0 table 30600", "stderr": "", "rc": 0, "cmd": ["nmcli", "-f", "ipv6.routing-rules", "c", "show", "ethtest0"], "start": "2024-09-24 15:05:13.156659", "end": "2024-09-24 15:05:13.174121", "delta": "0:00:00.017462", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f ipv6.routing-rules c show \"ethtest0\"", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 43681 1727204713.19199: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection 
to 10.31.10.90 closed. <<< 43681 1727204713.19259: stderr chunk (state=3): >>><<< 43681 1727204713.19263: stdout chunk (state=3): >>><<< 43681 1727204713.19281: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "ipv6.routing-rules: priority 30600 to 2001:db8::4/32 table 30600, priority 30601 not from ::/0 dport 128-256 table 30600, priority 30602 from ::/0 table 30600", "stderr": "", "rc": 0, "cmd": ["nmcli", "-f", "ipv6.routing-rules", "c", "show", "ethtest0"], "start": "2024-09-24 15:05:13.156659", "end": "2024-09-24 15:05:13.174121", "delta": "0:00:00.017462", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f ipv6.routing-rules c show \"ethtest0\"", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
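As with the IPv4 query, the raw module JSON above says "changed": true while the task result printed a little later shows "changed": false, which is the usual effect of changed_when: false on a read-only command. A sketch of the IPv6 counterpart, with the connection name templated from the interface fact the way the task banner shows (register name and changed_when are assumptions; the command itself is verbatim):

    - name: Get the IPv6 routing rule for the connection "{{ interface }}"
      command: nmcli -f ipv6.routing-rules c show "{{ interface }}"
      register: ipv6_routing_rules   # hypothetical variable name
      changed_when: false            # assumed, to match the changed: false in the callback output

Here interface resolves to ethtest0 via set_fact, as the resolved task name shows.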
43681 1727204713.19322: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f ipv6.routing-rules c show "ethtest0"', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204712.8676696-44401-206413208538079/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 43681 1727204713.19329: _low_level_execute_command(): starting 43681 1727204713.19335: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204712.8676696-44401-206413208538079/ > /dev/null 2>&1 && sleep 0' 43681 1727204713.19828: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204713.19833: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204713.27367: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204713.27372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204713.27374: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204713.27377: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204713.27379: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204713.27381: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204713.27383: stderr chunk (state=3): >>><<< 43681 1727204713.27385: stdout chunk (state=3): >>><<< 43681 1727204713.27387: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204713.27391: handler run complete 43681 1727204713.27394: Evaluated conditional (False): False 43681 1727204713.27396: attempt loop complete, returning result 43681 1727204713.27398: _execute() done 43681 1727204713.27400: dumping result to json 43681 1727204713.27402: done dumping result, returning 43681 1727204713.27404: done running TaskExecutor() for managed-node3/TASK: Get the IPv6 routing rule for the connection "ethtest0" [12b410aa-8751-9e86-7728-000000000061] 43681 1727204713.27406: sending task result for task 12b410aa-8751-9e86-7728-000000000061 43681 1727204713.27487: done sending task result for task 12b410aa-8751-9e86-7728-000000000061 43681 1727204713.27494: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": [ "nmcli", "-f", "ipv6.routing-rules", "c", "show", "ethtest0" ], "delta": "0:00:00.017462", "end": "2024-09-24 15:05:13.174121", "rc": 0, "start": "2024-09-24 15:05:13.156659" } STDOUT: ipv6.routing-rules: priority 30600 to 2001:db8::4/32 table 30600, priority 30601 not from ::/0 dport 128-256 table 30600, priority 30602 from ::/0 table 30600 43681 1727204713.27629: no more pending results, returning what we have 43681 1727204713.27634: results queue empty 43681 1727204713.27635: checking for any_errors_fatal 43681 1727204713.27643: done checking for any_errors_fatal 43681 1727204713.27644: checking for max_fail_percentage 43681 1727204713.27647: done checking for max_fail_percentage 43681 1727204713.27648: checking to see if all hosts have failed and the running result is not ok 43681 1727204713.27649: done checking to see if all hosts have failed 43681 1727204713.27650: getting the remaining hosts for this loop 43681 1727204713.27652: done getting the remaining hosts for this loop 43681 1727204713.27657: getting the next task for host managed-node3 43681 1727204713.27663: done getting next task for host managed-node3 43681 1727204713.27668: ^ task is: TASK: Assert that the routing rule with table lookup 30200 matches the specified rule 43681 1727204713.27686: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204713.27694: getting variables 43681 1727204713.27696: in VariableManager get_vars() 43681 1727204713.27735: Calling all_inventory to load vars for managed-node3 43681 1727204713.27739: Calling groups_inventory to load vars for managed-node3 43681 1727204713.27742: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204713.27753: Calling all_plugins_play to load vars for managed-node3 43681 1727204713.27757: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204713.27761: Calling groups_plugins_play to load vars for managed-node3 43681 1727204713.29350: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204713.31437: done with get_vars() 43681 1727204713.31463: done getting variables 43681 1727204713.31518: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert that the routing rule with table lookup 30200 matches the specified rule] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:155 Tuesday 24 September 2024 15:05:13 -0400 (0:00:00.502) 0:00:20.982 ***** 43681 1727204713.31542: entering _queue_task() for managed-node3/assert 43681 1727204713.31820: worker is 1 (out of 1 available) 43681 1727204713.31837: exiting _queue_task() for managed-node3/assert 43681 1727204713.31852: done queuing things up, now waiting for results queue to drain 43681 1727204713.31854: waiting for pending results... 
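The assertions queued here test route_rule_table_30200.stdout against patterns such as "30200:(\s+)from 198.51.100.58/26 lookup 30200", which is the format printed by ip rule, not the "priority ... table ..." property format returned by nmcli above. The variable is therefore presumably registered earlier in the play from an ip rule query; a rough sketch of what that registration might look like, with only the variable name taken from the log and everything else assumed:

    - name: Get the routing rules for table 30200   # hypothetical task name
      command: ip rule                              # assumed command; its output format matches the asserted patterns
      register: route_rule_table_30200
      changed_when: false                           # assumed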
43681 1727204713.32052: running TaskExecutor() for managed-node3/TASK: Assert that the routing rule with table lookup 30200 matches the specified rule 43681 1727204713.32135: in run() - task 12b410aa-8751-9e86-7728-000000000062 43681 1727204713.32149: variable 'ansible_search_path' from source: unknown 43681 1727204713.32181: calling self._execute() 43681 1727204713.32269: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204713.32276: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204713.32287: variable 'omit' from source: magic vars 43681 1727204713.32614: variable 'ansible_distribution_major_version' from source: facts 43681 1727204713.32626: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204713.32724: variable 'ansible_distribution_major_version' from source: facts 43681 1727204713.32731: Evaluated conditional (ansible_distribution_major_version != "7"): True 43681 1727204713.32740: variable 'omit' from source: magic vars 43681 1727204713.32760: variable 'omit' from source: magic vars 43681 1727204713.32793: variable 'omit' from source: magic vars 43681 1727204713.32831: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204713.32865: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204713.32884: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204713.32903: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204713.32913: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204713.32942: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204713.32946: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204713.32953: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204713.33036: Set connection var ansible_shell_type to sh 43681 1727204713.33043: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204713.33050: Set connection var ansible_timeout to 10 43681 1727204713.33061: Set connection var ansible_pipelining to False 43681 1727204713.33065: Set connection var ansible_connection to ssh 43681 1727204713.33076: Set connection var ansible_shell_executable to /bin/sh 43681 1727204713.33096: variable 'ansible_shell_executable' from source: unknown 43681 1727204713.33100: variable 'ansible_connection' from source: unknown 43681 1727204713.33102: variable 'ansible_module_compression' from source: unknown 43681 1727204713.33107: variable 'ansible_shell_type' from source: unknown 43681 1727204713.33110: variable 'ansible_shell_executable' from source: unknown 43681 1727204713.33114: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204713.33120: variable 'ansible_pipelining' from source: unknown 43681 1727204713.33122: variable 'ansible_timeout' from source: unknown 43681 1727204713.33128: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204713.33250: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 43681 1727204713.33260: variable 'omit' from source: magic vars 43681 1727204713.33266: starting attempt loop 43681 1727204713.33270: running the handler 43681 1727204713.33414: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 43681 1727204713.33603: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 43681 1727204713.33643: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 43681 1727204713.33700: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 43681 1727204713.33734: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 43681 1727204713.33804: variable 'route_rule_table_30200' from source: set_fact 43681 1727204713.33834: Evaluated conditional (route_rule_table_30200.stdout is search("30200:(\s+)from 198.51.100.58/26 lookup 30200")): True 43681 1727204713.33954: variable 'route_rule_table_30200' from source: set_fact 43681 1727204713.33975: Evaluated conditional (route_rule_table_30200.stdout is search("30201:(\s+)from all fwmark 0x1/0x1 lookup 30200")): True 43681 1727204713.34086: variable 'route_rule_table_30200' from source: set_fact 43681 1727204713.34109: Evaluated conditional (route_rule_table_30200.stdout is search("30202:(\s+)from all ipproto tcp lookup 30200")): True 43681 1727204713.34224: variable 'route_rule_table_30200' from source: set_fact 43681 1727204713.34246: Evaluated conditional (route_rule_table_30200.stdout is search("30203:(\s+)from all sport 128-256 lookup 30200")): True 43681 1727204713.34359: variable 'route_rule_table_30200' from source: set_fact 43681 1727204713.34386: Evaluated conditional (route_rule_table_30200.stdout is search("30204:(\s+)from all tos (0x08|throughput) lookup 30200")): True 43681 1727204713.34399: handler run complete 43681 1727204713.34413: attempt loop complete, returning result 43681 1727204713.34416: _execute() done 43681 1727204713.34422: dumping result to json 43681 1727204713.34425: done dumping result, returning 43681 1727204713.34434: done running TaskExecutor() for managed-node3/TASK: Assert that the routing rule with table lookup 30200 matches the specified rule [12b410aa-8751-9e86-7728-000000000062] 43681 1727204713.34440: sending task result for task 12b410aa-8751-9e86-7728-000000000062 43681 1727204713.34531: done sending task result for task 12b410aa-8751-9e86-7728-000000000062 43681 1727204713.34534: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 43681 1727204713.34585: no more pending results, returning what we have 43681 1727204713.34591: results queue empty 43681 1727204713.34592: checking for any_errors_fatal 43681 1727204713.34602: done checking for any_errors_fatal 43681 1727204713.34603: checking for max_fail_percentage 43681 1727204713.34605: done checking for max_fail_percentage 43681 1727204713.34606: checking to see if all hosts have failed and the running result is not ok 43681 1727204713.34607: done checking to see if all hosts have failed 43681 1727204713.34608: getting the remaining hosts for this loop 43681 1727204713.34609: done getting the remaining hosts for this loop 43681 1727204713.34614: getting the next task for host 
managed-node3 43681 1727204713.34621: done getting next task for host managed-node3 43681 1727204713.34624: ^ task is: TASK: Assert that the routing rule with table lookup 30400 matches the specified rule 43681 1727204713.34626: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 43681 1727204713.34629: getting variables 43681 1727204713.34631: in VariableManager get_vars() 43681 1727204713.34670: Calling all_inventory to load vars for managed-node3 43681 1727204713.34673: Calling groups_inventory to load vars for managed-node3 43681 1727204713.34676: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204713.34695: Calling all_plugins_play to load vars for managed-node3 43681 1727204713.34699: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204713.34702: Calling groups_plugins_play to load vars for managed-node3 43681 1727204713.36103: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204713.37674: done with get_vars() 43681 1727204713.37699: done getting variables 43681 1727204713.37752: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert that the routing rule with table lookup 30400 matches the specified rule] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:166 Tuesday 24 September 2024 15:05:13 -0400 (0:00:00.062) 0:00:21.044 ***** 43681 1727204713.37775: entering _queue_task() for managed-node3/assert 43681 1727204713.38037: worker is 1 (out of 1 available) 43681 1727204713.38051: exiting _queue_task() for managed-node3/assert 43681 1727204713.38065: done queuing things up, now waiting for results queue to drain 43681 1727204713.38067: waiting for pending results... 
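The five conditionals evaluated above for table 30200 read back naturally as a single assert task; a sketch reconstructed from the expressions in the log (the search patterns are verbatim, the surrounding task structure is a conventional reconstruction):

    - name: Assert that the routing rule with table lookup 30200 matches the specified rule
      assert:
        that:
          - route_rule_table_30200.stdout is search("30200:(\s+)from 198.51.100.58/26 lookup 30200")
          - route_rule_table_30200.stdout is search("30201:(\s+)from all fwmark 0x1/0x1 lookup 30200")
          - route_rule_table_30200.stdout is search("30202:(\s+)from all ipproto tcp lookup 30200")
          - route_rule_table_30200.stdout is search("30203:(\s+)from all sport 128-256 lookup 30200")
          - route_rule_table_30200.stdout is search("30204:(\s+)from all tos (0x08|throughput) lookup 30200")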
43681 1727204713.38267: running TaskExecutor() for managed-node3/TASK: Assert that the routing rule with table lookup 30400 matches the specified rule 43681 1727204713.38347: in run() - task 12b410aa-8751-9e86-7728-000000000063 43681 1727204713.38361: variable 'ansible_search_path' from source: unknown 43681 1727204713.38394: calling self._execute() 43681 1727204713.38479: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204713.38486: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204713.38497: variable 'omit' from source: magic vars 43681 1727204713.38831: variable 'ansible_distribution_major_version' from source: facts 43681 1727204713.38844: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204713.38941: variable 'ansible_distribution_major_version' from source: facts 43681 1727204713.38947: Evaluated conditional (ansible_distribution_major_version != "7"): True 43681 1727204713.38957: variable 'omit' from source: magic vars 43681 1727204713.38975: variable 'omit' from source: magic vars 43681 1727204713.39008: variable 'omit' from source: magic vars 43681 1727204713.39047: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204713.39082: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204713.39103: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204713.39119: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204713.39132: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204713.39162: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204713.39167: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204713.39170: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204713.39257: Set connection var ansible_shell_type to sh 43681 1727204713.39263: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204713.39271: Set connection var ansible_timeout to 10 43681 1727204713.39281: Set connection var ansible_pipelining to False 43681 1727204713.39291: Set connection var ansible_connection to ssh 43681 1727204713.39294: Set connection var ansible_shell_executable to /bin/sh 43681 1727204713.39316: variable 'ansible_shell_executable' from source: unknown 43681 1727204713.39322: variable 'ansible_connection' from source: unknown 43681 1727204713.39325: variable 'ansible_module_compression' from source: unknown 43681 1727204713.39329: variable 'ansible_shell_type' from source: unknown 43681 1727204713.39332: variable 'ansible_shell_executable' from source: unknown 43681 1727204713.39337: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204713.39342: variable 'ansible_pipelining' from source: unknown 43681 1727204713.39344: variable 'ansible_timeout' from source: unknown 43681 1727204713.39350: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204713.39472: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 43681 1727204713.39483: variable 'omit' from source: magic vars 43681 1727204713.39491: starting attempt loop 43681 1727204713.39494: running the handler 43681 1727204713.39645: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 43681 1727204713.39841: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 43681 1727204713.39877: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 43681 1727204713.39940: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 43681 1727204713.39973: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 43681 1727204713.40044: variable 'route_rule_table_30400' from source: set_fact 43681 1727204713.40075: Evaluated conditional (route_rule_table_30400.stdout is search("30400:(\s+)from all to 198.51.100.128/26 lookup 30400")): True 43681 1727204713.40195: variable 'route_rule_table_30400' from source: set_fact 43681 1727204713.40221: Evaluated conditional (route_rule_table_30400.stdout is search("30401:(\s+)from all iif iiftest \[detached\] lookup 30400")): True 43681 1727204713.40337: variable 'route_rule_table_30400' from source: set_fact 43681 1727204713.40361: Evaluated conditional (route_rule_table_30400.stdout is search("30402:(\s+)from all oif oiftest \[detached\] lookup 30400")): True 43681 1727204713.40369: handler run complete 43681 1727204713.40384: attempt loop complete, returning result 43681 1727204713.40396: _execute() done 43681 1727204713.40400: dumping result to json 43681 1727204713.40403: done dumping result, returning 43681 1727204713.40411: done running TaskExecutor() for managed-node3/TASK: Assert that the routing rule with table lookup 30400 matches the specified rule [12b410aa-8751-9e86-7728-000000000063] 43681 1727204713.40417: sending task result for task 12b410aa-8751-9e86-7728-000000000063 43681 1727204713.40511: done sending task result for task 12b410aa-8751-9e86-7728-000000000063 43681 1727204713.40515: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 43681 1727204713.40570: no more pending results, returning what we have 43681 1727204713.40574: results queue empty 43681 1727204713.40575: checking for any_errors_fatal 43681 1727204713.40593: done checking for any_errors_fatal 43681 1727204713.40594: checking for max_fail_percentage 43681 1727204713.40596: done checking for max_fail_percentage 43681 1727204713.40597: checking to see if all hosts have failed and the running result is not ok 43681 1727204713.40598: done checking to see if all hosts have failed 43681 1727204713.40599: getting the remaining hosts for this loop 43681 1727204713.40600: done getting the remaining hosts for this loop 43681 1727204713.40605: getting the next task for host managed-node3 43681 1727204713.40610: done getting next task for host managed-node3 43681 1727204713.40614: ^ task is: TASK: Assert that the routing rule with table lookup 30600 matches the specified rule 43681 1727204713.40616: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 43681 1727204713.40619: getting variables 43681 1727204713.40621: in VariableManager get_vars() 43681 1727204713.40661: Calling all_inventory to load vars for managed-node3 43681 1727204713.40664: Calling groups_inventory to load vars for managed-node3 43681 1727204713.40666: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204713.40677: Calling all_plugins_play to load vars for managed-node3 43681 1727204713.40680: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204713.40684: Calling groups_plugins_play to load vars for managed-node3 43681 1727204713.41933: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204713.43615: done with get_vars() 43681 1727204713.43640: done getting variables 43681 1727204713.43688: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert that the routing rule with table lookup 30600 matches the specified rule] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:175 Tuesday 24 September 2024 15:05:13 -0400 (0:00:00.059) 0:00:21.103 ***** 43681 1727204713.43713: entering _queue_task() for managed-node3/assert 43681 1727204713.43957: worker is 1 (out of 1 available) 43681 1727204713.43973: exiting _queue_task() for managed-node3/assert 43681 1727204713.43988: done queuing things up, now waiting for results queue to drain 43681 1727204713.43992: waiting for pending results... 
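Likewise, the three conditionals evaluated for table 30400 above correspond to an assert task along these lines (patterns verbatim from the log, task structure a reconstruction):

    - name: Assert that the routing rule with table lookup 30400 matches the specified rule
      assert:
        that:
          - route_rule_table_30400.stdout is search("30400:(\s+)from all to 198.51.100.128/26 lookup 30400")
          - route_rule_table_30400.stdout is search("30401:(\s+)from all iif iiftest \[detached\] lookup 30400")
          - route_rule_table_30400.stdout is search("30402:(\s+)from all oif oiftest \[detached\] lookup 30400")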
43681 1727204713.44181: running TaskExecutor() for managed-node3/TASK: Assert that the routing rule with table lookup 30600 matches the specified rule 43681 1727204713.44260: in run() - task 12b410aa-8751-9e86-7728-000000000064 43681 1727204713.44273: variable 'ansible_search_path' from source: unknown 43681 1727204713.44306: calling self._execute() 43681 1727204713.44392: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204713.44399: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204713.44409: variable 'omit' from source: magic vars 43681 1727204713.44732: variable 'ansible_distribution_major_version' from source: facts 43681 1727204713.44743: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204713.44843: variable 'ansible_distribution_major_version' from source: facts 43681 1727204713.44847: Evaluated conditional (ansible_distribution_major_version != "7"): True 43681 1727204713.44855: variable 'omit' from source: magic vars 43681 1727204713.44875: variable 'omit' from source: magic vars 43681 1727204713.44910: variable 'omit' from source: magic vars 43681 1727204713.44948: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204713.44983: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204713.45003: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204713.45022: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204713.45033: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204713.45061: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204713.45064: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204713.45069: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204713.45156: Set connection var ansible_shell_type to sh 43681 1727204713.45163: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204713.45170: Set connection var ansible_timeout to 10 43681 1727204713.45178: Set connection var ansible_pipelining to False 43681 1727204713.45185: Set connection var ansible_connection to ssh 43681 1727204713.45192: Set connection var ansible_shell_executable to /bin/sh 43681 1727204713.45217: variable 'ansible_shell_executable' from source: unknown 43681 1727204713.45223: variable 'ansible_connection' from source: unknown 43681 1727204713.45226: variable 'ansible_module_compression' from source: unknown 43681 1727204713.45230: variable 'ansible_shell_type' from source: unknown 43681 1727204713.45233: variable 'ansible_shell_executable' from source: unknown 43681 1727204713.45238: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204713.45243: variable 'ansible_pipelining' from source: unknown 43681 1727204713.45245: variable 'ansible_timeout' from source: unknown 43681 1727204713.45251: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204713.45372: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 43681 1727204713.45383: variable 'omit' from source: magic vars 43681 1727204713.45390: starting attempt loop 43681 1727204713.45394: running the handler 43681 1727204713.45544: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 43681 1727204713.45739: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 43681 1727204713.45778: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 43681 1727204713.45839: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 43681 1727204713.45872: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 43681 1727204713.45944: variable 'route_rule_table_30600' from source: set_fact 43681 1727204713.45975: Evaluated conditional (route_rule_table_30600.stdout is search("30600:(\s+)from all to 2001:db8::4/32 lookup 30600")): True 43681 1727204713.46092: variable 'route_rule_table_30600' from source: set_fact 43681 1727204713.46116: Evaluated conditional (route_rule_table_30600.stdout is search("30601:(\s+)not from all dport 128-256 lookup 30600")): True 43681 1727204713.46125: handler run complete 43681 1727204713.46139: attempt loop complete, returning result 43681 1727204713.46142: _execute() done 43681 1727204713.46146: dumping result to json 43681 1727204713.46151: done dumping result, returning 43681 1727204713.46158: done running TaskExecutor() for managed-node3/TASK: Assert that the routing rule with table lookup 30600 matches the specified rule [12b410aa-8751-9e86-7728-000000000064] 43681 1727204713.46164: sending task result for task 12b410aa-8751-9e86-7728-000000000064 43681 1727204713.46260: done sending task result for task 12b410aa-8751-9e86-7728-000000000064 43681 1727204713.46263: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 43681 1727204713.46340: no more pending results, returning what we have 43681 1727204713.46344: results queue empty 43681 1727204713.46345: checking for any_errors_fatal 43681 1727204713.46352: done checking for any_errors_fatal 43681 1727204713.46353: checking for max_fail_percentage 43681 1727204713.46355: done checking for max_fail_percentage 43681 1727204713.46355: checking to see if all hosts have failed and the running result is not ok 43681 1727204713.46356: done checking to see if all hosts have failed 43681 1727204713.46357: getting the remaining hosts for this loop 43681 1727204713.46359: done getting the remaining hosts for this loop 43681 1727204713.46363: getting the next task for host managed-node3 43681 1727204713.46369: done getting next task for host managed-node3 43681 1727204713.46372: ^ task is: TASK: Assert that the routing rule with 'custom' table lookup matches the specified rule 43681 1727204713.46374: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204713.46378: getting variables 43681 1727204713.46380: in VariableManager get_vars() 43681 1727204713.46416: Calling all_inventory to load vars for managed-node3 43681 1727204713.46419: Calling groups_inventory to load vars for managed-node3 43681 1727204713.46421: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204713.46432: Calling all_plugins_play to load vars for managed-node3 43681 1727204713.46435: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204713.46438: Calling groups_plugins_play to load vars for managed-node3 43681 1727204713.47656: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204713.49229: done with get_vars() 43681 1727204713.49251: done getting variables 43681 1727204713.49300: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert that the routing rule with 'custom' table lookup matches the specified rule] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:183 Tuesday 24 September 2024 15:05:13 -0400 (0:00:00.056) 0:00:21.159 ***** 43681 1727204713.49330: entering _queue_task() for managed-node3/assert 43681 1727204713.49577: worker is 1 (out of 1 available) 43681 1727204713.49594: exiting _queue_task() for managed-node3/assert 43681 1727204713.49609: done queuing things up, now waiting for results queue to drain 43681 1727204713.49611: waiting for pending results... 
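
As with the previous assertion, the task definition at tests_routing_rules.yml:183 is not shown in the log. Based on the single conditional evaluated below against route_rule_table_custom.stdout, its body presumably reduces to the sketch here; the registration of route_rule_table_custom from an earlier "ip rule" command is assumed and omitted, as are the distribution-version conditions.

- name: Assert that the routing rule with 'custom' table lookup matches the specified rule
  assert:
    that:
      # Pattern copied from the conditional evaluated in the log
      - route_rule_table_custom.stdout is search("200:(\s+)from 198.51.100.56/26 lookup custom")
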
43681 1727204713.49826: running TaskExecutor() for managed-node3/TASK: Assert that the routing rule with 'custom' table lookup matches the specified rule 43681 1727204713.49903: in run() - task 12b410aa-8751-9e86-7728-000000000065 43681 1727204713.49920: variable 'ansible_search_path' from source: unknown 43681 1727204713.49954: calling self._execute() 43681 1727204713.50042: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204713.50053: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204713.50063: variable 'omit' from source: magic vars 43681 1727204713.50386: variable 'ansible_distribution_major_version' from source: facts 43681 1727204713.50398: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204713.50492: variable 'ansible_distribution_major_version' from source: facts 43681 1727204713.50498: Evaluated conditional (ansible_distribution_major_version != "7"): True 43681 1727204713.50507: variable 'omit' from source: magic vars 43681 1727204713.50534: variable 'omit' from source: magic vars 43681 1727204713.50566: variable 'omit' from source: magic vars 43681 1727204713.50605: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204713.50643: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204713.50661: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204713.50678: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204713.50691: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204713.50719: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204713.50726: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204713.50731: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204713.50814: Set connection var ansible_shell_type to sh 43681 1727204713.50823: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204713.50832: Set connection var ansible_timeout to 10 43681 1727204713.50842: Set connection var ansible_pipelining to False 43681 1727204713.50848: Set connection var ansible_connection to ssh 43681 1727204713.50856: Set connection var ansible_shell_executable to /bin/sh 43681 1727204713.50875: variable 'ansible_shell_executable' from source: unknown 43681 1727204713.50879: variable 'ansible_connection' from source: unknown 43681 1727204713.50881: variable 'ansible_module_compression' from source: unknown 43681 1727204713.50885: variable 'ansible_shell_type' from source: unknown 43681 1727204713.50888: variable 'ansible_shell_executable' from source: unknown 43681 1727204713.50895: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204713.50900: variable 'ansible_pipelining' from source: unknown 43681 1727204713.50903: variable 'ansible_timeout' from source: unknown 43681 1727204713.50908: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204713.51031: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 43681 1727204713.51042: variable 'omit' from source: magic vars 43681 1727204713.51048: starting attempt loop 43681 1727204713.51053: running the handler 43681 1727204713.51199: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 43681 1727204713.51394: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 43681 1727204713.51694: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 43681 1727204713.51698: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 43681 1727204713.51700: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 43681 1727204713.51703: variable 'route_rule_table_custom' from source: set_fact 43681 1727204713.51718: Evaluated conditional (route_rule_table_custom.stdout is search("200:(\s+)from 198.51.100.56/26 lookup custom")): True 43681 1727204713.51732: handler run complete 43681 1727204713.51756: attempt loop complete, returning result 43681 1727204713.51765: _execute() done 43681 1727204713.51773: dumping result to json 43681 1727204713.51781: done dumping result, returning 43681 1727204713.51797: done running TaskExecutor() for managed-node3/TASK: Assert that the routing rule with 'custom' table lookup matches the specified rule [12b410aa-8751-9e86-7728-000000000065] 43681 1727204713.51810: sending task result for task 12b410aa-8751-9e86-7728-000000000065 ok: [managed-node3] => { "changed": false } MSG: All assertions passed 43681 1727204713.51981: no more pending results, returning what we have 43681 1727204713.51985: results queue empty 43681 1727204713.51986: checking for any_errors_fatal 43681 1727204713.52029: done checking for any_errors_fatal 43681 1727204713.52030: checking for max_fail_percentage 43681 1727204713.52033: done checking for max_fail_percentage 43681 1727204713.52034: checking to see if all hosts have failed and the running result is not ok 43681 1727204713.52035: done checking to see if all hosts have failed 43681 1727204713.52035: getting the remaining hosts for this loop 43681 1727204713.52037: done getting the remaining hosts for this loop 43681 1727204713.52041: getting the next task for host managed-node3 43681 1727204713.52048: done getting next task for host managed-node3 43681 1727204713.52052: ^ task is: TASK: Assert that the specified IPv4 routing rule was configured in the connection "{{ interface }}" 43681 1727204713.52054: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204713.52057: getting variables 43681 1727204713.52059: in VariableManager get_vars() 43681 1727204713.52100: Calling all_inventory to load vars for managed-node3 43681 1727204713.52103: Calling groups_inventory to load vars for managed-node3 43681 1727204713.52106: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204713.52119: Calling all_plugins_play to load vars for managed-node3 43681 1727204713.52297: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204713.52309: Calling groups_plugins_play to load vars for managed-node3 43681 1727204713.52830: done sending task result for task 12b410aa-8751-9e86-7728-000000000065 43681 1727204713.52834: WORKER PROCESS EXITING 43681 1727204713.53959: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204713.56198: done with get_vars() 43681 1727204713.56238: done getting variables 43681 1727204713.56308: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 43681 1727204713.56454: variable 'interface' from source: set_fact TASK [Assert that the specified IPv4 routing rule was configured in the connection "ethtest0"] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:190 Tuesday 24 September 2024 15:05:13 -0400 (0:00:00.071) 0:00:21.231 ***** 43681 1727204713.56496: entering _queue_task() for managed-node3/assert 43681 1727204713.56843: worker is 1 (out of 1 available) 43681 1727204713.56860: exiting _queue_task() for managed-node3/assert 43681 1727204713.56874: done queuing things up, now waiting for results queue to drain 43681 1727204713.56876: waiting for pending results... 
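
The task queued here verifies the IPv4 routing rules written into the "ethtest0" connection profile. Its ten conditions are visible in the evaluations recorded below; the sketch that follows is the assert body they imply. The connection_route_rule variable is presumably registered earlier from a command that lists the connection's routing rules (not shown here; an assumption), and the interface variable is set via set_fact to "ethtest0" according to the log.

- name: Assert that the specified IPv4 routing rule was configured in the connection "{{ interface }}"
  assert:
    that:
      # Patterns copied from the conditionals evaluated in the log
      - connection_route_rule.stdout is search("priority 30200 from 198.51.100.58/26 table 30200")
      - connection_route_rule.stdout is search("priority 30201 from 0.0.0.0/0 fwmark 0x1/0x1 table 30200")
      - connection_route_rule.stdout is search("priority 30202 from 0.0.0.0/0 ipproto 6 table 30200")
      - connection_route_rule.stdout is search("priority 30203 from 0.0.0.0/0 sport 128-256 table 30200")
      - connection_route_rule.stdout is search("priority 30204 from 0.0.0.0/0 tos 0x08 table 30200")
      - connection_route_rule.stdout is search("priority 30400 to 198.51.100.128/26 table 30400")
      - connection_route_rule.stdout is search("priority 30401 from 0.0.0.0/0 iif iiftest table 30400")
      - connection_route_rule.stdout is search("priority 30402 from 0.0.0.0/0 oif oiftest table 30400")
      - connection_route_rule.stdout is search("priority 30403 from 0.0.0.0/0 table 30400")
      - connection_route_rule.stdout is search("priority 200 from 198.51.100.56/26 table 200")
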
43681 1727204713.57308: running TaskExecutor() for managed-node3/TASK: Assert that the specified IPv4 routing rule was configured in the connection "ethtest0" 43681 1727204713.57314: in run() - task 12b410aa-8751-9e86-7728-000000000066 43681 1727204713.57336: variable 'ansible_search_path' from source: unknown 43681 1727204713.57381: calling self._execute() 43681 1727204713.57502: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204713.57518: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204713.57538: variable 'omit' from source: magic vars 43681 1727204713.58002: variable 'ansible_distribution_major_version' from source: facts 43681 1727204713.58025: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204713.58037: variable 'omit' from source: magic vars 43681 1727204713.58064: variable 'omit' from source: magic vars 43681 1727204713.58294: variable 'interface' from source: set_fact 43681 1727204713.58298: variable 'omit' from source: magic vars 43681 1727204713.58302: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204713.58334: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204713.58363: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204713.58391: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204713.58412: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204713.58456: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204713.58466: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204713.58531: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204713.58610: Set connection var ansible_shell_type to sh 43681 1727204713.58626: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204713.58643: Set connection var ansible_timeout to 10 43681 1727204713.58658: Set connection var ansible_pipelining to False 43681 1727204713.58669: Set connection var ansible_connection to ssh 43681 1727204713.58680: Set connection var ansible_shell_executable to /bin/sh 43681 1727204713.58712: variable 'ansible_shell_executable' from source: unknown 43681 1727204713.58723: variable 'ansible_connection' from source: unknown 43681 1727204713.58731: variable 'ansible_module_compression' from source: unknown 43681 1727204713.58738: variable 'ansible_shell_type' from source: unknown 43681 1727204713.58750: variable 'ansible_shell_executable' from source: unknown 43681 1727204713.58757: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204713.58856: variable 'ansible_pipelining' from source: unknown 43681 1727204713.58859: variable 'ansible_timeout' from source: unknown 43681 1727204713.58861: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204713.58967: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 43681 
1727204713.58986: variable 'omit' from source: magic vars 43681 1727204713.59000: starting attempt loop 43681 1727204713.59007: running the handler 43681 1727204713.59236: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 43681 1727204713.59541: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 43681 1727204713.59601: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 43681 1727204713.59688: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 43681 1727204713.59742: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 43681 1727204713.59853: variable 'connection_route_rule' from source: set_fact 43681 1727204713.59892: Evaluated conditional (connection_route_rule.stdout is search("priority 30200 from 198.51.100.58/26 table 30200")): True 43681 1727204713.60086: variable 'connection_route_rule' from source: set_fact 43681 1727204713.60125: Evaluated conditional (connection_route_rule.stdout is search("priority 30201 from 0.0.0.0/0 fwmark 0x1/0x1 table 30200")): True 43681 1727204713.60379: variable 'connection_route_rule' from source: set_fact 43681 1727204713.60382: Evaluated conditional (connection_route_rule.stdout is search("priority 30202 from 0.0.0.0/0 ipproto 6 table 30200")): True 43681 1727204713.60533: variable 'connection_route_rule' from source: set_fact 43681 1727204713.60565: Evaluated conditional (connection_route_rule.stdout is search("priority 30203 from 0.0.0.0/0 sport 128-256 table 30200")): True 43681 1727204713.60750: variable 'connection_route_rule' from source: set_fact 43681 1727204713.60783: Evaluated conditional (connection_route_rule.stdout is search("priority 30204 from 0.0.0.0/0 tos 0x08 table 30200")): True 43681 1727204713.60972: variable 'connection_route_rule' from source: set_fact 43681 1727204713.61006: Evaluated conditional (connection_route_rule.stdout is search("priority 30400 to 198.51.100.128/26 table 30400")): True 43681 1727204713.61193: variable 'connection_route_rule' from source: set_fact 43681 1727204713.61230: Evaluated conditional (connection_route_rule.stdout is search("priority 30401 from 0.0.0.0/0 iif iiftest table 30400")): True 43681 1727204713.61413: variable 'connection_route_rule' from source: set_fact 43681 1727204713.61450: Evaluated conditional (connection_route_rule.stdout is search("priority 30402 from 0.0.0.0/0 oif oiftest table 30400")): True 43681 1727204713.61795: variable 'connection_route_rule' from source: set_fact 43681 1727204713.61798: Evaluated conditional (connection_route_rule.stdout is search("priority 30403 from 0.0.0.0/0 table 30400")): True 43681 1727204713.61813: variable 'connection_route_rule' from source: set_fact 43681 1727204713.61843: Evaluated conditional (connection_route_rule.stdout is search("priority 200 from 198.51.100.56/26 table 200")): True 43681 1727204713.61855: handler run complete 43681 1727204713.61876: attempt loop complete, returning result 43681 1727204713.61883: _execute() done 43681 1727204713.61892: dumping result to json 43681 1727204713.61899: done dumping result, returning 43681 1727204713.61918: done running TaskExecutor() for managed-node3/TASK: Assert that the specified IPv4 routing rule was configured in the connection "ethtest0" [12b410aa-8751-9e86-7728-000000000066] 43681 1727204713.61928: sending task result for task 
12b410aa-8751-9e86-7728-000000000066 ok: [managed-node3] => { "changed": false } MSG: All assertions passed 43681 1727204713.62158: no more pending results, returning what we have 43681 1727204713.62163: results queue empty 43681 1727204713.62164: checking for any_errors_fatal 43681 1727204713.62173: done checking for any_errors_fatal 43681 1727204713.62174: checking for max_fail_percentage 43681 1727204713.62177: done checking for max_fail_percentage 43681 1727204713.62177: checking to see if all hosts have failed and the running result is not ok 43681 1727204713.62179: done checking to see if all hosts have failed 43681 1727204713.62179: getting the remaining hosts for this loop 43681 1727204713.62181: done getting the remaining hosts for this loop 43681 1727204713.62196: getting the next task for host managed-node3 43681 1727204713.62204: done getting next task for host managed-node3 43681 1727204713.62208: ^ task is: TASK: Assert that the specified IPv6 routing rule was configured in the connection "{{ interface }}" 43681 1727204713.62211: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 43681 1727204713.62215: getting variables 43681 1727204713.62220: in VariableManager get_vars() 43681 1727204713.62270: Calling all_inventory to load vars for managed-node3 43681 1727204713.62274: Calling groups_inventory to load vars for managed-node3 43681 1727204713.62277: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204713.62496: Calling all_plugins_play to load vars for managed-node3 43681 1727204713.62502: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204713.62508: Calling groups_plugins_play to load vars for managed-node3 43681 1727204713.63207: done sending task result for task 12b410aa-8751-9e86-7728-000000000066 43681 1727204713.63211: WORKER PROCESS EXITING 43681 1727204713.65355: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204713.71728: done with get_vars() 43681 1727204713.71765: done getting variables 43681 1727204713.72044: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 43681 1727204713.72376: variable 'interface' from source: set_fact TASK [Assert that the specified IPv6 routing rule was configured in the connection "ethtest0"] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:205 Tuesday 24 September 2024 15:05:13 -0400 (0:00:00.159) 0:00:21.390 ***** 43681 1727204713.72414: entering _queue_task() for managed-node3/assert 43681 1727204713.72947: worker is 1 (out of 1 available) 43681 1727204713.72964: exiting _queue_task() for managed-node3/assert 43681 1727204713.72979: done queuing things up, now waiting for results queue to drain 43681 1727204713.72981: waiting for pending results... 
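
The companion IPv6 check queued here has three conditions, evaluated further down against connection_route_rule6.stdout. The second condition accepts either placement of "not" in the negated rule, apparently to tolerate differences in how the rule is rendered. A sketch of the assert body implied by those evaluations (registration of connection_route_rule6 is again assumed and omitted):

- name: Assert that the specified IPv6 routing rule was configured in the connection "{{ interface }}"
  assert:
    that:
      # Patterns copied from the conditionals evaluated in the log
      - connection_route_rule6.stdout is search("priority 30600 to 2001:db8::4/32 table 30600")
      - connection_route_rule6.stdout is search("priority 30601 not from ::/0 dport 128-256 table 30600") or connection_route_rule6.stdout is search("not priority 30601 from ::/0 dport 128-256 table 30600")
      - connection_route_rule6.stdout is search("priority 30602 from ::/0 table 30600")
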
43681 1727204713.73253: running TaskExecutor() for managed-node3/TASK: Assert that the specified IPv6 routing rule was configured in the connection "ethtest0" 43681 1727204713.73387: in run() - task 12b410aa-8751-9e86-7728-000000000067 43681 1727204713.73412: variable 'ansible_search_path' from source: unknown 43681 1727204713.73460: calling self._execute() 43681 1727204713.73580: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204713.73601: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204713.73618: variable 'omit' from source: magic vars 43681 1727204713.74099: variable 'ansible_distribution_major_version' from source: facts 43681 1727204713.74118: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204713.74130: variable 'omit' from source: magic vars 43681 1727204713.74166: variable 'omit' from source: magic vars 43681 1727204713.74313: variable 'interface' from source: set_fact 43681 1727204713.74341: variable 'omit' from source: magic vars 43681 1727204713.74402: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204713.74451: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204713.74484: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204713.74520: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204713.74538: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204713.74582: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204713.74593: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204713.74602: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204713.74740: Set connection var ansible_shell_type to sh 43681 1727204713.74755: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204713.74767: Set connection var ansible_timeout to 10 43681 1727204713.74796: Set connection var ansible_pipelining to False 43681 1727204713.74800: Set connection var ansible_connection to ssh 43681 1727204713.74810: Set connection var ansible_shell_executable to /bin/sh 43681 1727204713.74894: variable 'ansible_shell_executable' from source: unknown 43681 1727204713.74897: variable 'ansible_connection' from source: unknown 43681 1727204713.74902: variable 'ansible_module_compression' from source: unknown 43681 1727204713.74904: variable 'ansible_shell_type' from source: unknown 43681 1727204713.74906: variable 'ansible_shell_executable' from source: unknown 43681 1727204713.74908: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204713.74910: variable 'ansible_pipelining' from source: unknown 43681 1727204713.74912: variable 'ansible_timeout' from source: unknown 43681 1727204713.74915: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204713.75075: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 43681 
1727204713.75139: variable 'omit' from source: magic vars 43681 1727204713.75143: starting attempt loop 43681 1727204713.75145: running the handler 43681 1727204713.75340: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 43681 1727204713.75645: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 43681 1727204713.75711: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 43681 1727204713.75803: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 43681 1727204713.75912: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 43681 1727204713.75962: variable 'connection_route_rule6' from source: set_fact 43681 1727204713.76001: Evaluated conditional (connection_route_rule6.stdout is search("priority 30600 to 2001:db8::4/32 table 30600")): True 43681 1727204713.76283: variable 'connection_route_rule6' from source: set_fact 43681 1727204713.76321: Evaluated conditional (connection_route_rule6.stdout is search("priority 30601 not from ::/0 dport 128-256 table 30600") or connection_route_rule6.stdout is search("not priority 30601 from ::/0 dport 128-256 table 30600")): True 43681 1727204713.76514: variable 'connection_route_rule6' from source: set_fact 43681 1727204713.76546: Evaluated conditional (connection_route_rule6.stdout is search("priority 30602 from ::/0 table 30600")): True 43681 1727204713.76562: handler run complete 43681 1727204713.76671: attempt loop complete, returning result 43681 1727204713.76675: _execute() done 43681 1727204713.76679: dumping result to json 43681 1727204713.76681: done dumping result, returning 43681 1727204713.76684: done running TaskExecutor() for managed-node3/TASK: Assert that the specified IPv6 routing rule was configured in the connection "ethtest0" [12b410aa-8751-9e86-7728-000000000067] 43681 1727204713.76686: sending task result for task 12b410aa-8751-9e86-7728-000000000067 43681 1727204713.76759: done sending task result for task 12b410aa-8751-9e86-7728-000000000067 43681 1727204713.76763: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 43681 1727204713.76830: no more pending results, returning what we have 43681 1727204713.76835: results queue empty 43681 1727204713.76836: checking for any_errors_fatal 43681 1727204713.76847: done checking for any_errors_fatal 43681 1727204713.76848: checking for max_fail_percentage 43681 1727204713.76850: done checking for max_fail_percentage 43681 1727204713.76851: checking to see if all hosts have failed and the running result is not ok 43681 1727204713.76852: done checking to see if all hosts have failed 43681 1727204713.76853: getting the remaining hosts for this loop 43681 1727204713.76855: done getting the remaining hosts for this loop 43681 1727204713.76860: getting the next task for host managed-node3 43681 1727204713.76866: done getting next task for host managed-node3 43681 1727204713.76870: ^ task is: TASK: Remove the dedicated test file in `/etc/iproute2/rt_tables.d/` 43681 1727204713.76984: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204713.76990: getting variables 43681 1727204713.76995: in VariableManager get_vars() 43681 1727204713.77040: Calling all_inventory to load vars for managed-node3 43681 1727204713.77043: Calling groups_inventory to load vars for managed-node3 43681 1727204713.77046: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204713.77059: Calling all_plugins_play to load vars for managed-node3 43681 1727204713.77062: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204713.77066: Calling groups_plugins_play to load vars for managed-node3 43681 1727204713.79544: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204713.82682: done with get_vars() 43681 1727204713.82732: done getting variables TASK [Remove the dedicated test file in `/etc/iproute2/rt_tables.d/`] ********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:213 Tuesday 24 September 2024 15:05:13 -0400 (0:00:00.104) 0:00:21.495 ***** 43681 1727204713.82860: entering _queue_task() for managed-node3/file 43681 1727204713.83496: worker is 1 (out of 1 available) 43681 1727204713.83507: exiting _queue_task() for managed-node3/file 43681 1727204713.83519: done queuing things up, now waiting for results queue to drain 43681 1727204713.83521: waiting for pending results... 43681 1727204713.83713: running TaskExecutor() for managed-node3/TASK: Remove the dedicated test file in `/etc/iproute2/rt_tables.d/` 43681 1727204713.83735: in run() - task 12b410aa-8751-9e86-7728-000000000068 43681 1727204713.83762: variable 'ansible_search_path' from source: unknown 43681 1727204713.83826: calling self._execute() 43681 1727204713.83939: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204713.83962: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204713.84043: variable 'omit' from source: magic vars 43681 1727204713.84441: variable 'ansible_distribution_major_version' from source: facts 43681 1727204713.84460: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204713.84477: variable 'omit' from source: magic vars 43681 1727204713.84511: variable 'omit' from source: magic vars 43681 1727204713.84561: variable 'omit' from source: magic vars 43681 1727204713.84623: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204713.84669: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204713.84703: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204713.84734: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204713.84753: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204713.84807: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204713.84811: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204713.84829: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204713.84956: Set connection var ansible_shell_type to sh 43681 1727204713.84970: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 
1727204713.85025: Set connection var ansible_timeout to 10 43681 1727204713.85028: Set connection var ansible_pipelining to False 43681 1727204713.85031: Set connection var ansible_connection to ssh 43681 1727204713.85033: Set connection var ansible_shell_executable to /bin/sh 43681 1727204713.85058: variable 'ansible_shell_executable' from source: unknown 43681 1727204713.85066: variable 'ansible_connection' from source: unknown 43681 1727204713.85073: variable 'ansible_module_compression' from source: unknown 43681 1727204713.85080: variable 'ansible_shell_type' from source: unknown 43681 1727204713.85087: variable 'ansible_shell_executable' from source: unknown 43681 1727204713.85098: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204713.85106: variable 'ansible_pipelining' from source: unknown 43681 1727204713.85133: variable 'ansible_timeout' from source: unknown 43681 1727204713.85136: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204713.85383: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 43681 1727204713.85461: variable 'omit' from source: magic vars 43681 1727204713.85465: starting attempt loop 43681 1727204713.85467: running the handler 43681 1727204713.85470: _low_level_execute_command(): starting 43681 1727204713.85472: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 43681 1727204713.86235: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204713.86311: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204713.86383: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204713.86403: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204713.86429: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204713.86516: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204713.88297: stdout chunk (state=3): >>>/root <<< 43681 1727204713.88515: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204713.88519: stdout chunk (state=3): >>><<< 43681 1727204713.88522: stderr chunk (state=3): >>><<< 43681 1727204713.88546: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204713.88569: _low_level_execute_command(): starting 43681 1727204713.88582: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204713.8855476-44433-245477318976306 `" && echo ansible-tmp-1727204713.8855476-44433-245477318976306="` echo /root/.ansible/tmp/ansible-tmp-1727204713.8855476-44433-245477318976306 `" ) && sleep 0' 43681 1727204713.89211: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204713.89228: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204713.89252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204713.89271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204713.89304: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 43681 1727204713.89363: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204713.89422: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204713.89444: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204713.89504: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204713.89550: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204713.91577: stdout chunk (state=3): >>>ansible-tmp-1727204713.8855476-44433-245477318976306=/root/.ansible/tmp/ansible-tmp-1727204713.8855476-44433-245477318976306 <<< 43681 1727204713.91809: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204713.91813: stdout chunk (state=3): >>><<< 43681 1727204713.91816: stderr chunk (state=3): >>><<< 43681 1727204713.91819: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204713.8855476-44433-245477318976306=/root/.ansible/tmp/ansible-tmp-1727204713.8855476-44433-245477318976306 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204713.91860: variable 'ansible_module_compression' from source: unknown 43681 1727204713.91928: ANSIBALLZ: Using lock for file 43681 1727204713.91931: ANSIBALLZ: Acquiring lock 43681 1727204713.91934: ANSIBALLZ: Lock acquired: 140156139377472 43681 1727204713.91939: ANSIBALLZ: Creating module 43681 1727204714.10100: ANSIBALLZ: Writing module into payload 43681 1727204714.10258: ANSIBALLZ: Writing module 43681 1727204714.10276: ANSIBALLZ: Renaming module 43681 1727204714.10283: ANSIBALLZ: Done creating module 43681 1727204714.10301: variable 'ansible_facts' from source: unknown 43681 1727204714.10562: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204713.8855476-44433-245477318976306/AnsiballZ_file.py 43681 1727204714.10651: Sending initial data 43681 1727204714.10655: Sent initial data (153 bytes) 43681 1727204714.11008: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204714.11012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204714.11015: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204714.11017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 43681 1727204714.11020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204714.11070: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204714.11073: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204714.11125: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204714.12873: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 43681 1727204714.12908: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 43681 1727204714.12940: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-436817kzep24q/tmpav494vt_ /root/.ansible/tmp/ansible-tmp-1727204713.8855476-44433-245477318976306/AnsiballZ_file.py <<< 43681 1727204714.12949: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204713.8855476-44433-245477318976306/AnsiballZ_file.py" <<< 43681 1727204714.12976: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-436817kzep24q/tmpav494vt_" to remote "/root/.ansible/tmp/ansible-tmp-1727204713.8855476-44433-245477318976306/AnsiballZ_file.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204713.8855476-44433-245477318976306/AnsiballZ_file.py" <<< 43681 1727204714.13763: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204714.13843: stderr chunk (state=3): >>><<< 43681 1727204714.13846: stdout chunk (state=3): >>><<< 43681 1727204714.13869: done transferring module to remote 43681 1727204714.13881: _low_level_execute_command(): starting 43681 1727204714.13886: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204713.8855476-44433-245477318976306/ /root/.ansible/tmp/ansible-tmp-1727204713.8855476-44433-245477318976306/AnsiballZ_file.py && sleep 0' 43681 1727204714.14370: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204714.14373: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 43681 1727204714.14378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 43681 1727204714.14381: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204714.14387: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204714.14442: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204714.14449: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204714.14451: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204714.14484: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204714.16405: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204714.16459: stderr chunk (state=3): >>><<< 43681 1727204714.16463: stdout chunk (state=3): >>><<< 43681 1727204714.16479: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204714.16482: _low_level_execute_command(): starting 43681 1727204714.16488: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204713.8855476-44433-245477318976306/AnsiballZ_file.py && sleep 0' 43681 1727204714.16963: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204714.16967: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 43681 1727204714.16969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 43681 1727204714.16971: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204714.16974: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204714.17020: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204714.17038: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 
43681 1727204714.17079: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204714.35275: stdout chunk (state=3): >>> {"path": "/etc/iproute2/rt_tables.d/table.conf", "changed": true, "diff": {"before": {"path": "/etc/iproute2/rt_tables.d/table.conf", "state": "file"}, "after": {"path": "/etc/iproute2/rt_tables.d/table.conf", "state": "absent"}}, "state": "absent", "invocation": {"module_args": {"state": "absent", "path": "/etc/iproute2/rt_tables.d/table.conf", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "mode": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 43681 1727204714.36803: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 43681 1727204714.36868: stderr chunk (state=3): >>><<< 43681 1727204714.36872: stdout chunk (state=3): >>><<< 43681 1727204714.36888: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/iproute2/rt_tables.d/table.conf", "changed": true, "diff": {"before": {"path": "/etc/iproute2/rt_tables.d/table.conf", "state": "file"}, "after": {"path": "/etc/iproute2/rt_tables.d/table.conf", "state": "absent"}}, "state": "absent", "invocation": {"module_args": {"state": "absent", "path": "/etc/iproute2/rt_tables.d/table.conf", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "mode": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
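
The module arguments echoed in the result above pin this cleanup task down almost completely: the file module was invoked with state "absent" on /etc/iproute2/rt_tables.d/table.conf, and the before/after diff (state "file" to "absent") explains the changed result reported next. A sketch of the task at tests_routing_rules.yml:213 follows; the module name, path, and state come from the recorded invocation, while the exact YAML layout is an assumption.

- name: Remove the dedicated test file in `/etc/iproute2/rt_tables.d/`
  file:
    path: /etc/iproute2/rt_tables.d/table.conf
    state: absent

Because the file existed on the managed node, the module removes it and reports a change; a second run would find it already absent and report ok, keeping the cleanup idempotent.
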
43681 1727204714.36933: done with _execute_module (file, {'state': 'absent', 'path': '/etc/iproute2/rt_tables.d/table.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204713.8855476-44433-245477318976306/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 43681 1727204714.36943: _low_level_execute_command(): starting 43681 1727204714.36948: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204713.8855476-44433-245477318976306/ > /dev/null 2>&1 && sleep 0' 43681 1727204714.37434: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204714.37442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 43681 1727204714.37445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 43681 1727204714.37448: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204714.37450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204714.37495: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204714.37509: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204714.37549: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204714.39611: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204714.39615: stdout chunk (state=3): >>><<< 43681 1727204714.39617: stderr chunk (state=3): >>><<< 43681 1727204714.39620: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204714.39622: handler run complete 43681 1727204714.39722: attempt loop complete, returning result 43681 1727204714.39726: _execute() done 43681 1727204714.39728: dumping result to json 43681 1727204714.39730: done dumping result, returning 43681 1727204714.39733: done running TaskExecutor() for managed-node3/TASK: Remove the dedicated test file in `/etc/iproute2/rt_tables.d/` [12b410aa-8751-9e86-7728-000000000068] 43681 1727204714.39735: sending task result for task 12b410aa-8751-9e86-7728-000000000068 43681 1727204714.39812: done sending task result for task 12b410aa-8751-9e86-7728-000000000068 43681 1727204714.39816: WORKER PROCESS EXITING changed: [managed-node3] => { "changed": true, "path": "/etc/iproute2/rt_tables.d/table.conf", "state": "absent" } 43681 1727204714.39947: no more pending results, returning what we have 43681 1727204714.39952: results queue empty 43681 1727204714.39953: checking for any_errors_fatal 43681 1727204714.39961: done checking for any_errors_fatal 43681 1727204714.39962: checking for max_fail_percentage 43681 1727204714.39964: done checking for max_fail_percentage 43681 1727204714.39964: checking to see if all hosts have failed and the running result is not ok 43681 1727204714.39965: done checking to see if all hosts have failed 43681 1727204714.39966: getting the remaining hosts for this loop 43681 1727204714.39968: done getting the remaining hosts for this loop 43681 1727204714.39974: getting the next task for host managed-node3 43681 1727204714.39981: done getting next task for host managed-node3 43681 1727204714.39984: ^ task is: TASK: meta (flush_handlers) 43681 1727204714.39986: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204714.40197: getting variables 43681 1727204714.40200: in VariableManager get_vars() 43681 1727204714.40243: Calling all_inventory to load vars for managed-node3 43681 1727204714.40247: Calling groups_inventory to load vars for managed-node3 43681 1727204714.40250: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204714.40270: Calling all_plugins_play to load vars for managed-node3 43681 1727204714.40274: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204714.40280: Calling groups_plugins_play to load vars for managed-node3 43681 1727204714.42765: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204714.44909: done with get_vars() 43681 1727204714.44937: done getting variables 43681 1727204714.44999: in VariableManager get_vars() 43681 1727204714.45011: Calling all_inventory to load vars for managed-node3 43681 1727204714.45012: Calling groups_inventory to load vars for managed-node3 43681 1727204714.45014: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204714.45021: Calling all_plugins_play to load vars for managed-node3 43681 1727204714.45023: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204714.45025: Calling groups_plugins_play to load vars for managed-node3 43681 1727204714.46256: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204714.48192: done with get_vars() 43681 1727204714.48220: done queuing things up, now waiting for results queue to drain 43681 1727204714.48222: results queue empty 43681 1727204714.48222: checking for any_errors_fatal 43681 1727204714.48226: done checking for any_errors_fatal 43681 1727204714.48226: checking for max_fail_percentage 43681 1727204714.48227: done checking for max_fail_percentage 43681 1727204714.48228: checking to see if all hosts have failed and the running result is not ok 43681 1727204714.48228: done checking to see if all hosts have failed 43681 1727204714.48229: getting the remaining hosts for this loop 43681 1727204714.48230: done getting the remaining hosts for this loop 43681 1727204714.48232: getting the next task for host managed-node3 43681 1727204714.48235: done getting next task for host managed-node3 43681 1727204714.48236: ^ task is: TASK: meta (flush_handlers) 43681 1727204714.48237: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204714.48240: getting variables 43681 1727204714.48241: in VariableManager get_vars() 43681 1727204714.48250: Calling all_inventory to load vars for managed-node3 43681 1727204714.48251: Calling groups_inventory to load vars for managed-node3 43681 1727204714.48253: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204714.48257: Calling all_plugins_play to load vars for managed-node3 43681 1727204714.48259: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204714.48261: Calling groups_plugins_play to load vars for managed-node3 43681 1727204714.49326: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204714.50856: done with get_vars() 43681 1727204714.50875: done getting variables 43681 1727204714.50924: in VariableManager get_vars() 43681 1727204714.50935: Calling all_inventory to load vars for managed-node3 43681 1727204714.50936: Calling groups_inventory to load vars for managed-node3 43681 1727204714.50938: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204714.50942: Calling all_plugins_play to load vars for managed-node3 43681 1727204714.50944: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204714.50946: Calling groups_plugins_play to load vars for managed-node3 43681 1727204714.55794: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204714.57322: done with get_vars() 43681 1727204714.57348: done queuing things up, now waiting for results queue to drain 43681 1727204714.57350: results queue empty 43681 1727204714.57351: checking for any_errors_fatal 43681 1727204714.57352: done checking for any_errors_fatal 43681 1727204714.57352: checking for max_fail_percentage 43681 1727204714.57353: done checking for max_fail_percentage 43681 1727204714.57354: checking to see if all hosts have failed and the running result is not ok 43681 1727204714.57354: done checking to see if all hosts have failed 43681 1727204714.57355: getting the remaining hosts for this loop 43681 1727204714.57356: done getting the remaining hosts for this loop 43681 1727204714.57358: getting the next task for host managed-node3 43681 1727204714.57360: done getting next task for host managed-node3 43681 1727204714.57361: ^ task is: None 43681 1727204714.57362: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204714.57363: done queuing things up, now waiting for results queue to drain 43681 1727204714.57364: results queue empty 43681 1727204714.57365: checking for any_errors_fatal 43681 1727204714.57365: done checking for any_errors_fatal 43681 1727204714.57366: checking for max_fail_percentage 43681 1727204714.57367: done checking for max_fail_percentage 43681 1727204714.57367: checking to see if all hosts have failed and the running result is not ok 43681 1727204714.57368: done checking to see if all hosts have failed 43681 1727204714.57370: getting the next task for host managed-node3 43681 1727204714.57372: done getting next task for host managed-node3 43681 1727204714.57373: ^ task is: None 43681 1727204714.57374: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 43681 1727204714.57410: in VariableManager get_vars() 43681 1727204714.57426: done with get_vars() 43681 1727204714.57430: in VariableManager get_vars() 43681 1727204714.57439: done with get_vars() 43681 1727204714.57442: variable 'omit' from source: magic vars 43681 1727204714.57521: variable 'profile' from source: play vars 43681 1727204714.57607: in VariableManager get_vars() 43681 1727204714.57620: done with get_vars() 43681 1727204714.57635: variable 'omit' from source: magic vars 43681 1727204714.57686: variable 'profile' from source: play vars PLAY [Set down {{ profile }}] ************************************************** 43681 1727204714.58285: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 43681 1727204714.58307: getting the remaining hosts for this loop 43681 1727204714.58308: done getting the remaining hosts for this loop 43681 1727204714.58310: getting the next task for host managed-node3 43681 1727204714.58312: done getting next task for host managed-node3 43681 1727204714.58314: ^ task is: TASK: Gathering Facts 43681 1727204714.58315: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204714.58317: getting variables 43681 1727204714.58318: in VariableManager get_vars() 43681 1727204714.58327: Calling all_inventory to load vars for managed-node3 43681 1727204714.58328: Calling groups_inventory to load vars for managed-node3 43681 1727204714.58330: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204714.58335: Calling all_plugins_play to load vars for managed-node3 43681 1727204714.58336: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204714.58339: Calling groups_plugins_play to load vars for managed-node3 43681 1727204714.59374: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204714.60988: done with get_vars() 43681 1727204714.61010: done getting variables 43681 1727204714.61050: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3 Tuesday 24 September 2024 15:05:14 -0400 (0:00:00.782) 0:00:22.277 ***** 43681 1727204714.61070: entering _queue_task() for managed-node3/gather_facts 43681 1727204714.61340: worker is 1 (out of 1 available) 43681 1727204714.61356: exiting _queue_task() for managed-node3/gather_facts 43681 1727204714.61370: done queuing things up, now waiting for results queue to drain 43681 1727204714.61372: waiting for pending results... 
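(Editor's annotation, hedged: the records above cover the completion of the cleanup task "Remove the dedicated test file in `/etc/iproute2/rt_tables.d/`" and the start of the next play, "Set down {{ profile }}", whose Gathering Facts task comes from down_profile.yml:3. The task source itself is not included in this log; the sketch below is only a reconstruction from the module arguments logged for _execute_module, where the path and state values are taken from the log and everything else, including the ansible.builtin.file FQCN, is an assumption.)

    # Hedged reconstruction of the cleanup task implied by the logged module args;
    # the real task file in the collection may differ.
    - name: Remove the dedicated test file in `/etc/iproute2/rt_tables.d/`
      ansible.builtin.file:
        path: /etc/iproute2/rt_tables.d/table.conf   # from the logged module args
        state: absent                                # produces the "changed" result above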
43681 1727204714.61559: running TaskExecutor() for managed-node3/TASK: Gathering Facts 43681 1727204714.61630: in run() - task 12b410aa-8751-9e86-7728-0000000004b1 43681 1727204714.61643: variable 'ansible_search_path' from source: unknown 43681 1727204714.61675: calling self._execute() 43681 1727204714.61764: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204714.61770: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204714.61780: variable 'omit' from source: magic vars 43681 1727204714.62111: variable 'ansible_distribution_major_version' from source: facts 43681 1727204714.62125: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204714.62131: variable 'omit' from source: magic vars 43681 1727204714.62156: variable 'omit' from source: magic vars 43681 1727204714.62187: variable 'omit' from source: magic vars 43681 1727204714.62226: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204714.62260: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204714.62282: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204714.62301: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204714.62313: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204714.62343: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204714.62347: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204714.62350: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204714.62439: Set connection var ansible_shell_type to sh 43681 1727204714.62445: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204714.62452: Set connection var ansible_timeout to 10 43681 1727204714.62461: Set connection var ansible_pipelining to False 43681 1727204714.62467: Set connection var ansible_connection to ssh 43681 1727204714.62477: Set connection var ansible_shell_executable to /bin/sh 43681 1727204714.62498: variable 'ansible_shell_executable' from source: unknown 43681 1727204714.62502: variable 'ansible_connection' from source: unknown 43681 1727204714.62505: variable 'ansible_module_compression' from source: unknown 43681 1727204714.62507: variable 'ansible_shell_type' from source: unknown 43681 1727204714.62512: variable 'ansible_shell_executable' from source: unknown 43681 1727204714.62518: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204714.62521: variable 'ansible_pipelining' from source: unknown 43681 1727204714.62524: variable 'ansible_timeout' from source: unknown 43681 1727204714.62530: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204714.62682: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 43681 1727204714.62694: variable 'omit' from source: magic vars 43681 1727204714.62701: starting attempt loop 43681 1727204714.62704: running the 
handler 43681 1727204714.62721: variable 'ansible_facts' from source: unknown 43681 1727204714.62737: _low_level_execute_command(): starting 43681 1727204714.62744: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 43681 1727204714.63302: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204714.63306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204714.63310: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 43681 1727204714.63314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204714.63368: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204714.63372: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204714.63421: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204714.65182: stdout chunk (state=3): >>>/root <<< 43681 1727204714.65283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204714.65339: stderr chunk (state=3): >>><<< 43681 1727204714.65342: stdout chunk (state=3): >>><<< 43681 1727204714.65365: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204714.65378: _low_level_execute_command(): starting 43681 1727204714.65384: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204714.6536565-44524-255188215099283 `" && echo ansible-tmp-1727204714.6536565-44524-255188215099283="` echo 
/root/.ansible/tmp/ansible-tmp-1727204714.6536565-44524-255188215099283 `" ) && sleep 0' 43681 1727204714.65834: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204714.65839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204714.65842: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204714.65852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204714.65855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204714.65902: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204714.65907: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204714.65946: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204714.67925: stdout chunk (state=3): >>>ansible-tmp-1727204714.6536565-44524-255188215099283=/root/.ansible/tmp/ansible-tmp-1727204714.6536565-44524-255188215099283 <<< 43681 1727204714.68045: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204714.68087: stderr chunk (state=3): >>><<< 43681 1727204714.68093: stdout chunk (state=3): >>><<< 43681 1727204714.68109: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204714.6536565-44524-255188215099283=/root/.ansible/tmp/ansible-tmp-1727204714.6536565-44524-255188215099283 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204714.68140: variable 'ansible_module_compression' from source: unknown 43681 1727204714.68180: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-436817kzep24q/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 43681 1727204714.68239: variable 'ansible_facts' from source: unknown 43681 1727204714.68359: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204714.6536565-44524-255188215099283/AnsiballZ_setup.py 43681 1727204714.68478: Sending initial data 43681 1727204714.68482: Sent initial data (154 bytes) 43681 1727204714.68926: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204714.68930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 43681 1727204714.68933: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 43681 1727204714.68935: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 43681 1727204714.68937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204714.68984: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204714.68994: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204714.69028: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204714.70637: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 43681 1727204714.70664: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 43681 1727204714.70698: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-436817kzep24q/tmp4yhmc3l4 /root/.ansible/tmp/ansible-tmp-1727204714.6536565-44524-255188215099283/AnsiballZ_setup.py <<< 43681 1727204714.70711: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204714.6536565-44524-255188215099283/AnsiballZ_setup.py" <<< 43681 1727204714.70734: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-436817kzep24q/tmp4yhmc3l4" to remote "/root/.ansible/tmp/ansible-tmp-1727204714.6536565-44524-255188215099283/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204714.6536565-44524-255188215099283/AnsiballZ_setup.py" <<< 43681 1727204714.72356: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204714.72415: stderr chunk (state=3): >>><<< 43681 1727204714.72422: stdout chunk (state=3): >>><<< 43681 1727204714.72441: done transferring module to remote 43681 1727204714.72451: _low_level_execute_command(): starting 43681 1727204714.72457: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204714.6536565-44524-255188215099283/ /root/.ansible/tmp/ansible-tmp-1727204714.6536565-44524-255188215099283/AnsiballZ_setup.py && sleep 0' 43681 1727204714.73083: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204714.73120: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204714.75034: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204714.75152: stderr chunk (state=3): >>><<< 43681 1727204714.75169: stdout chunk (state=3): >>><<< 43681 1727204714.75281: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204714.75285: _low_level_execute_command(): starting 43681 1727204714.75295: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204714.6536565-44524-255188215099283/AnsiballZ_setup.py && sleep 0' 43681 1727204714.75896: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204714.75918: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204714.75936: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204714.75967: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204714.76012: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 43681 1727204714.76079: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204714.76130: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204714.76151: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204714.76180: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204714.76265: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204715.48761: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", 
"ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec272ed147e29e35f2e68cd6465c5ec1", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAI5YZQ7OH6eqgmanrwxkUl16pMvE2q26X32NofYRKBzF04m84VIsiCBP80rN+sGEKnRhTwlxJwcSfAyscmxkynk8ozeR0SaMEECkbOjee1DqGR1yz8VSKEIk2gZ+ImYscF6c32jGvz1w/gz9baswEs+v92Ljqv3+V3s8foVkwWM1AAAAFQDApo03iAyJzp9y7AillVl9LpN8rwAAAIBNHNvfLLH/rvWMdavYWGiljarx5Z8cDKFv4QiliuY2AenrQ5mjBN3ZJZuDpmwC9vuoPM+TWxp9pbrnVJy4VM6iS8c/Lr9I982fUD4neMvJEywdnYtsRhezGMCk57/Npw91h6EKhcAYiaFF53jl540WIjTvu2bEA8Hgb11YGH+isAAAAIAkremps+61DEFeDWQjRHbf8fZzhmpUYduU+sHRW5usa/1cOOeeN/8XBHfMST6TPedAY/6t7Oxda9D2mq6mo2Rl9arSQWcBypqwvzRiz0LGnRnElGtXKJALy6vYKG7xi+29ZmqlBvD14cB7/wSZqZP9MkRj3+QzQJLvNnuGRyLguA==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDBj8PEtqglWtlJ3r3hgP2TELjSd8JOOpjIitLlWjKdUao5ePB6PWTf9MZV0rLZr0re7hAS1EWeexARYQakyETmyOoPmRCaD5vvrfN3AJJ6I+O2EhApLpYrEORJbTfrme6AoCGmxQG8tR7j3YpVOvePZ65ka7FDUWnNLI0DWpyDURAKmvOxtiOcYazpmB7GJ/5ycpEAV7KGp7tEQ9MNIAbSaYTBXVBNa5V2HyEmcabs+/Qy/jp8OWy+Tl3uCUV0SmFplVCKib9Kp3eEMZd5udXsYnmUYtLNMJQkQOzTdol5AozustkdBnasVn/RSnQpWQMBrrUQMxchNOb8FDAuH6AONEVJl9mHY6mk3zfkkyPZE6sIrMIj0B48xTWzMIjC+N9SN7DRRUWzjYIqkL5fsYu0fkkGuZeNvyJRlv8h7oFWA7YtvNHdNYf41mkXryERg8V3zI0aZcmQul6XTOxywwd4b5sudMIng09hfyPOKtnYi6DIN2h5FxOWlvBEbLlcd2U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPUvqdp1GSRMDwSqfOZO1hLGpDfzy51B9cIhTK2AWy7qlUXPaSlJ0jc31uj+CW3SnUW36VSKRHdj9R9hJev9Zic=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFL7RdA+aCgUcBhcJBLwti3mnwduhYXxSw8RlI3Cvebm", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_iscsi_iqn": "", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "05", "second": "15", "epoch": "1727204715", "epoch_int": "1727204715", "date": "2024-09-24", "time": "15:05:15", "iso8601_micro": "2024-09-24T19:05:15.075347Z", "iso8601": "2024-09-24T19:05:15Z", "iso8601_basic": "20240924T150515075347", "iso8601_basic_short": "20240924T150515", "t<<< 43681 1727204715.48768: stdout chunk (state=3): >>>z": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 50414 10.31.10.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", 
"SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 50414 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_local": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fibre_channel_wwn": [], "ansible_is_chroot": false, "ansible_interfaces": ["eth0", "peerethtest0", "ethtest0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::37d3:4e93:30d:de94", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": 
"off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": "86:6c:78:87:31:5f", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv4": {"address": "198.51.100.3", "broadcast": "198.51.100.63", "netmask": "255.255.255.192", "network": "198.51.100.0", "prefix": "26"}, "ipv6": [{"address": "2001:db8::2", "prefix": "32", "scope": "global"}, {"address": "fe80::94e4:32e8:aef8:7c74", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", 
"tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "b2:8f:09:74:fb:c0", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::b08f:9ff:fe74:fbc0", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": 
"off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.90", "198.51.100.3"], "ansible_all_ipv6_addresses": ["fe80::37d3:4e93:30d:de94", "2001:db8::2", "fe80::94e4:32e8:aef8:7c74", "fe80::b08f:9ff:fe74:fbc0"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.90", "127.0.0.0/8", "127.0.0.1", "198.51.100.3"], "ipv6": ["::1", "2001:db8::2", "fe80::37d3:4e93:30d:de94", "fe80::94e4:32e8:aef8:7c74", "fe80::b08f:9ff:fe74:fbc0"]}, "ansible_lsb": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2852, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 865, "free": 2852}, "nocache": {"free": 3483, "used": 234}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_uuid": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, 
"removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 1219, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251148201984, "block_size": 4096, "block_total": 64479564, "block_available": 61315479, "block_used": 3164085, "inode_total": 16384000, "inode_available": 16302069, "inode_used": 81931, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_fips": false, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_hostnqn": "", "ansible_apparmor": {"status": "disabled"}, "ansible_loadavg": {"1m": 0.8115234375, "5m": 0.85498046875, "15m": 0.546875}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 43681 1727204715.50742: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204715.50809: stderr chunk (state=3): >>>Shared connection to 10.31.10.90 closed. <<< 43681 1727204715.50872: stderr chunk (state=3): >>><<< 43681 1727204715.50885: stdout chunk (state=3): >>><<< 43681 1727204715.51052: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec272ed147e29e35f2e68cd6465c5ec1", "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBAI5YZQ7OH6eqgmanrwxkUl16pMvE2q26X32NofYRKBzF04m84VIsiCBP80rN+sGEKnRhTwlxJwcSfAyscmxkynk8ozeR0SaMEECkbOjee1DqGR1yz8VSKEIk2gZ+ImYscF6c32jGvz1w/gz9baswEs+v92Ljqv3+V3s8foVkwWM1AAAAFQDApo03iAyJzp9y7AillVl9LpN8rwAAAIBNHNvfLLH/rvWMdavYWGiljarx5Z8cDKFv4QiliuY2AenrQ5mjBN3ZJZuDpmwC9vuoPM+TWxp9pbrnVJy4VM6iS8c/Lr9I982fUD4neMvJEywdnYtsRhezGMCk57/Npw91h6EKhcAYiaFF53jl540WIjTvu2bEA8Hgb11YGH+isAAAAIAkremps+61DEFeDWQjRHbf8fZzhmpUYduU+sHRW5usa/1cOOeeN/8XBHfMST6TPedAY/6t7Oxda9D2mq6mo2Rl9arSQWcBypqwvzRiz0LGnRnElGtXKJALy6vYKG7xi+29ZmqlBvD14cB7/wSZqZP9MkRj3+QzQJLvNnuGRyLguA==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDBj8PEtqglWtlJ3r3hgP2TELjSd8JOOpjIitLlWjKdUao5ePB6PWTf9MZV0rLZr0re7hAS1EWeexARYQakyETmyOoPmRCaD5vvrfN3AJJ6I+O2EhApLpYrEORJbTfrme6AoCGmxQG8tR7j3YpVOvePZ65ka7FDUWnNLI0DWpyDURAKmvOxtiOcYazpmB7GJ/5ycpEAV7KGp7tEQ9MNIAbSaYTBXVBNa5V2HyEmcabs+/Qy/jp8OWy+Tl3uCUV0SmFplVCKib9Kp3eEMZd5udXsYnmUYtLNMJQkQOzTdol5AozustkdBnasVn/RSnQpWQMBrrUQMxchNOb8FDAuH6AONEVJl9mHY6mk3zfkkyPZE6sIrMIj0B48xTWzMIjC+N9SN7DRRUWzjYIqkL5fsYu0fkkGuZeNvyJRlv8h7oFWA7YtvNHdNYf41mkXryERg8V3zI0aZcmQul6XTOxywwd4b5sudMIng09hfyPOKtnYi6DIN2h5FxOWlvBEbLlcd2U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPUvqdp1GSRMDwSqfOZO1hLGpDfzy51B9cIhTK2AWy7qlUXPaSlJ0jc31uj+CW3SnUW36VSKRHdj9R9hJev9Zic=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFL7RdA+aCgUcBhcJBLwti3mnwduhYXxSw8RlI3Cvebm", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_iscsi_iqn": "", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "05", "second": "15", "epoch": "1727204715", "epoch_int": "1727204715", "date": "2024-09-24", "time": "15:05:15", "iso8601_micro": "2024-09-24T19:05:15.075347Z", "iso8601": "2024-09-24T19:05:15Z", "iso8601_basic": "20240924T150515075347", "iso8601_basic_short": "20240924T150515", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 50414 10.31.10.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 50414 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", 
"DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_local": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fibre_channel_wwn": [], "ansible_is_chroot": false, "ansible_interfaces": ["eth0", "peerethtest0", "ethtest0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::37d3:4e93:30d:de94", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": 
[{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": "86:6c:78:87:31:5f", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv4": {"address": "198.51.100.3", "broadcast": "198.51.100.63", "netmask": "255.255.255.192", "network": "198.51.100.0", "prefix": "26"}, "ipv6": [{"address": "2001:db8::2", "prefix": "32", "scope": "global"}, {"address": "fe80::94e4:32e8:aef8:7c74", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off 
[fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "b2:8f:09:74:fb:c0", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::b08f:9ff:fe74:fbc0", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", 
"macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.90", "198.51.100.3"], "ansible_all_ipv6_addresses": ["fe80::37d3:4e93:30d:de94", "2001:db8::2", "fe80::94e4:32e8:aef8:7c74", "fe80::b08f:9ff:fe74:fbc0"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.90", "127.0.0.0/8", "127.0.0.1", "198.51.100.3"], "ipv6": ["::1", "2001:db8::2", "fe80::37d3:4e93:30d:de94", "fe80::94e4:32e8:aef8:7c74", "fe80::b08f:9ff:fe74:fbc0"]}, "ansible_lsb": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2852, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 865, "free": 2852}, "nocache": {"free": 3483, "used": 234}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_uuid": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": 
["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 1219, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251148201984, "block_size": 4096, "block_total": 64479564, "block_available": 61315479, "block_used": 3164085, "inode_total": 16384000, "inode_available": 16302069, "inode_used": 81931, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_fips": false, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_hostnqn": "", "ansible_apparmor": {"status": "disabled"}, "ansible_loadavg": {"1m": 0.8115234375, "5m": 0.85498046875, "15m": 0.546875}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
43681 1727204715.52696: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204714.6536565-44524-255188215099283/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 43681 1727204715.52699: _low_level_execute_command(): starting 43681 1727204715.52702: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204714.6536565-44524-255188215099283/ > /dev/null 2>&1 && sleep 0' 43681 1727204715.53881: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204715.54109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204715.54146: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204715.54154: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204715.54157: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204715.54208: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204715.56156: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204715.56230: stderr chunk (state=3): >>><<< 43681 1727204715.56240: stdout chunk (state=3): >>><<< 43681 1727204715.56264: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204715.56280: handler run complete 43681 1727204715.56744: variable 'ansible_facts' from source: unknown 43681 1727204715.57194: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204715.58419: variable 'ansible_facts' from source: unknown 43681 1727204715.58743: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204715.59209: attempt loop complete, returning result 43681 1727204715.59494: _execute() done 43681 1727204715.59498: dumping result to json 43681 1727204715.59501: done dumping result, returning 43681 1727204715.59503: done running TaskExecutor() for managed-node3/TASK: Gathering Facts [12b410aa-8751-9e86-7728-0000000004b1] 43681 1727204715.59506: sending task result for task 12b410aa-8751-9e86-7728-0000000004b1 43681 1727204715.60938: done sending task result for task 12b410aa-8751-9e86-7728-0000000004b1 43681 1727204715.60942: WORKER PROCESS EXITING ok: [managed-node3] 43681 1727204715.62069: no more pending results, returning what we have 43681 1727204715.62074: results queue empty 43681 1727204715.62075: checking for any_errors_fatal 43681 1727204715.62076: done checking for any_errors_fatal 43681 1727204715.62077: checking for max_fail_percentage 43681 1727204715.62079: done checking for max_fail_percentage 43681 1727204715.62080: checking to see if all hosts have failed and the running result is not ok 43681 1727204715.62081: done checking to see if all hosts have failed 43681 1727204715.62082: getting the remaining hosts for this loop 43681 1727204715.62084: done getting the remaining hosts for this loop 43681 1727204715.62088: getting the next task for host managed-node3 43681 1727204715.62145: done getting next task for host managed-node3 43681 1727204715.62148: ^ task is: TASK: meta (flush_handlers) 43681 1727204715.62150: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204715.62155: getting variables 43681 1727204715.62156: in VariableManager get_vars() 43681 1727204715.62188: Calling all_inventory to load vars for managed-node3 43681 1727204715.62192: Calling groups_inventory to load vars for managed-node3 43681 1727204715.62196: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204715.62207: Calling all_plugins_play to load vars for managed-node3 43681 1727204715.62211: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204715.62214: Calling groups_plugins_play to load vars for managed-node3 43681 1727204715.65615: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204715.69813: done with get_vars() 43681 1727204715.69856: done getting variables 43681 1727204715.69957: in VariableManager get_vars() 43681 1727204715.69973: Calling all_inventory to load vars for managed-node3 43681 1727204715.69976: Calling groups_inventory to load vars for managed-node3 43681 1727204715.69979: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204715.69985: Calling all_plugins_play to load vars for managed-node3 43681 1727204715.69988: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204715.69995: Calling groups_plugins_play to load vars for managed-node3 43681 1727204715.73063: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204715.77859: done with get_vars() 43681 1727204715.78047: done queuing things up, now waiting for results queue to drain 43681 1727204715.78050: results queue empty 43681 1727204715.78051: checking for any_errors_fatal 43681 1727204715.78057: done checking for any_errors_fatal 43681 1727204715.78058: checking for max_fail_percentage 43681 1727204715.78059: done checking for max_fail_percentage 43681 1727204715.78060: checking to see if all hosts have failed and the running result is not ok 43681 1727204715.78061: done checking to see if all hosts have failed 43681 1727204715.78067: getting the remaining hosts for this loop 43681 1727204715.78068: done getting the remaining hosts for this loop 43681 1727204715.78072: getting the next task for host managed-node3 43681 1727204715.78077: done getting next task for host managed-node3 43681 1727204715.78081: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 43681 1727204715.78083: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204715.78098: getting variables 43681 1727204715.78099: in VariableManager get_vars() 43681 1727204715.78120: Calling all_inventory to load vars for managed-node3 43681 1727204715.78123: Calling groups_inventory to load vars for managed-node3 43681 1727204715.78126: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204715.78251: Calling all_plugins_play to load vars for managed-node3 43681 1727204715.78255: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204715.78260: Calling groups_plugins_play to load vars for managed-node3 43681 1727204715.83028: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204715.86842: done with get_vars() 43681 1727204715.86882: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 15:05:15 -0400 (0:00:01.259) 0:00:23.536 ***** 43681 1727204715.86997: entering _queue_task() for managed-node3/include_tasks 43681 1727204715.87424: worker is 1 (out of 1 available) 43681 1727204715.87440: exiting _queue_task() for managed-node3/include_tasks 43681 1727204715.87571: done queuing things up, now waiting for results queue to drain 43681 1727204715.87573: waiting for pending results... 43681 1727204715.87811: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 43681 1727204715.87968: in run() - task 12b410aa-8751-9e86-7728-000000000071 43681 1727204715.87996: variable 'ansible_search_path' from source: unknown 43681 1727204715.88010: variable 'ansible_search_path' from source: unknown 43681 1727204715.88065: calling self._execute() 43681 1727204715.88200: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204715.88220: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204715.88243: variable 'omit' from source: magic vars 43681 1727204715.88755: variable 'ansible_distribution_major_version' from source: facts 43681 1727204715.88809: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204715.88813: _execute() done 43681 1727204715.88818: dumping result to json 43681 1727204715.88821: done dumping result, returning 43681 1727204715.88826: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12b410aa-8751-9e86-7728-000000000071] 43681 1727204715.88884: sending task result for task 12b410aa-8751-9e86-7728-000000000071 43681 1727204715.89074: no more pending results, returning what we have 43681 1727204715.89082: in VariableManager get_vars() 43681 1727204715.89167: Calling all_inventory to load vars for managed-node3 43681 1727204715.89170: Calling groups_inventory to load vars for managed-node3 43681 1727204715.89173: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204715.89191: Calling all_plugins_play to load vars for managed-node3 43681 1727204715.89196: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204715.89201: Calling groups_plugins_play to load vars for managed-node3 43681 1727204715.90106: done sending task result for task 12b410aa-8751-9e86-7728-000000000071 43681 1727204715.90110: WORKER PROCESS EXITING 43681 1727204715.93170: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204716.00335: done with get_vars() 43681 1727204716.00374: variable 'ansible_search_path' from source: unknown 43681 1727204716.00376: variable 'ansible_search_path' from source: unknown 43681 1727204716.00414: we have included files to process 43681 1727204716.00416: generating all_blocks data 43681 1727204716.00418: done generating all_blocks data 43681 1727204716.00419: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 43681 1727204716.00420: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 43681 1727204716.00423: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 43681 1727204716.02076: done processing included file 43681 1727204716.02078: iterating over new_blocks loaded from include file 43681 1727204716.02080: in VariableManager get_vars() 43681 1727204716.02111: done with get_vars() 43681 1727204716.02113: filtering new block on tags 43681 1727204716.02134: done filtering new block on tags 43681 1727204716.02138: in VariableManager get_vars() 43681 1727204716.02163: done with get_vars() 43681 1727204716.02166: filtering new block on tags 43681 1727204716.02307: done filtering new block on tags 43681 1727204716.02310: in VariableManager get_vars() 43681 1727204716.02337: done with get_vars() 43681 1727204716.02339: filtering new block on tags 43681 1727204716.02361: done filtering new block on tags 43681 1727204716.02364: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node3 43681 1727204716.02370: extending task lists for all hosts with included blocks 43681 1727204716.03565: done extending task lists 43681 1727204716.03567: done processing included files 43681 1727204716.03568: results queue empty 43681 1727204716.03569: checking for any_errors_fatal 43681 1727204716.03571: done checking for any_errors_fatal 43681 1727204716.03572: checking for max_fail_percentage 43681 1727204716.03573: done checking for max_fail_percentage 43681 1727204716.03574: checking to see if all hosts have failed and the running result is not ok 43681 1727204716.03575: done checking to see if all hosts have failed 43681 1727204716.03576: getting the remaining hosts for this loop 43681 1727204716.03578: done getting the remaining hosts for this loop 43681 1727204716.03581: getting the next task for host managed-node3 43681 1727204716.03585: done getting next task for host managed-node3 43681 1727204716.03588: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 43681 1727204716.03593: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204716.03718: getting variables 43681 1727204716.03720: in VariableManager get_vars() 43681 1727204716.03738: Calling all_inventory to load vars for managed-node3 43681 1727204716.03741: Calling groups_inventory to load vars for managed-node3 43681 1727204716.03744: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204716.03751: Calling all_plugins_play to load vars for managed-node3 43681 1727204716.03754: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204716.03758: Calling groups_plugins_play to load vars for managed-node3 43681 1727204716.08533: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204716.15578: done with get_vars() 43681 1727204716.15635: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 15:05:16 -0400 (0:00:00.287) 0:00:23.824 ***** 43681 1727204716.15889: entering _queue_task() for managed-node3/setup 43681 1727204716.16687: worker is 1 (out of 1 available) 43681 1727204716.16702: exiting _queue_task() for managed-node3/setup 43681 1727204716.16714: done queuing things up, now waiting for results queue to drain 43681 1727204716.16718: waiting for pending results... 43681 1727204716.17159: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 43681 1727204716.17453: in run() - task 12b410aa-8751-9e86-7728-0000000004f2 43681 1727204716.17471: variable 'ansible_search_path' from source: unknown 43681 1727204716.17475: variable 'ansible_search_path' from source: unknown 43681 1727204716.17523: calling self._execute() 43681 1727204716.17863: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204716.17867: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204716.17877: variable 'omit' from source: magic vars 43681 1727204716.18861: variable 'ansible_distribution_major_version' from source: facts 43681 1727204716.18874: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204716.19434: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 43681 1727204716.25195: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 43681 1727204716.25388: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 43681 1727204716.25494: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 43681 1727204716.25533: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 43681 1727204716.25564: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 43681 1727204716.25775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204716.25932: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 43681 1727204716.26042: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204716.26122: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204716.26138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204716.26207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204716.26353: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204716.26382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204716.26490: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204716.26503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204716.26938: variable '__network_required_facts' from source: role '' defaults 43681 1727204716.26949: variable 'ansible_facts' from source: unknown 43681 1727204716.29426: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 43681 1727204716.29435: when evaluation is False, skipping this task 43681 1727204716.29439: _execute() done 43681 1727204716.29442: dumping result to json 43681 1727204716.29444: done dumping result, returning 43681 1727204716.29447: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12b410aa-8751-9e86-7728-0000000004f2] 43681 1727204716.29495: sending task result for task 12b410aa-8751-9e86-7728-0000000004f2 43681 1727204716.29677: done sending task result for task 12b410aa-8751-9e86-7728-0000000004f2 43681 1727204716.29680: WORKER PROCESS EXITING skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 43681 1727204716.29842: no more pending results, returning what we have 43681 1727204716.29847: results queue empty 43681 1727204716.29848: checking for any_errors_fatal 43681 1727204716.29850: done checking for any_errors_fatal 43681 1727204716.29851: checking for max_fail_percentage 43681 1727204716.29853: done checking for max_fail_percentage 43681 1727204716.29854: checking to see if all hosts have failed and the running result is not ok 43681 1727204716.29855: done checking to see if all hosts have failed 43681 1727204716.29856: getting the remaining hosts for this loop 43681 1727204716.29857: done getting the remaining hosts for 
this loop 43681 1727204716.29862: getting the next task for host managed-node3 43681 1727204716.29871: done getting next task for host managed-node3 43681 1727204716.29876: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 43681 1727204716.29879: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 43681 1727204716.29897: getting variables 43681 1727204716.29899: in VariableManager get_vars() 43681 1727204716.29946: Calling all_inventory to load vars for managed-node3 43681 1727204716.29950: Calling groups_inventory to load vars for managed-node3 43681 1727204716.29953: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204716.29965: Calling all_plugins_play to load vars for managed-node3 43681 1727204716.29969: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204716.29973: Calling groups_plugins_play to load vars for managed-node3 43681 1727204716.35280: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204716.40200: done with get_vars() 43681 1727204716.40275: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 15:05:16 -0400 (0:00:00.246) 0:00:24.070 ***** 43681 1727204716.40410: entering _queue_task() for managed-node3/stat 43681 1727204716.41015: worker is 1 (out of 1 available) 43681 1727204716.41026: exiting _queue_task() for managed-node3/stat 43681 1727204716.41039: done queuing things up, now waiting for results queue to drain 43681 1727204716.41041: waiting for pending results... 
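The skip recorded above comes from the conditional on the "Ensure ansible_facts used by role are present" task (set_facts.yml:3): setup is re-run only when __network_required_facts names facts that are absent from ansible_facts, and here nothing is missing. A minimal Python sketch of that difference check, with hypothetical fact names standing in for the role's actual __network_required_facts value:

# Hypothetical stand-ins; the role's real __network_required_facts list is not
# shown in this log and may differ.
required_facts = ["distribution", "distribution_major_version", "os_family"]

# Keys already present in ansible_facts after the setup call logged above
# (heavily trimmed -- the real dict holds every gathered fact).
ansible_facts = {
    "distribution": "Fedora",
    "distribution_major_version": "39",
    "os_family": "RedHat",
}

# Jinja2's difference filter on these lists reduces to a set difference:
missing = set(required_facts) - set(ansible_facts.keys())

# The task runs only when the difference is non-empty; here it is empty,
# which matches the "Evaluated conditional ...: False" line above.
print(len(missing) > 0)   # False
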
43681 1727204716.41310: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 43681 1727204716.41381: in run() - task 12b410aa-8751-9e86-7728-0000000004f4 43681 1727204716.41386: variable 'ansible_search_path' from source: unknown 43681 1727204716.41390: variable 'ansible_search_path' from source: unknown 43681 1727204716.41437: calling self._execute() 43681 1727204716.41636: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204716.41644: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204716.41697: variable 'omit' from source: magic vars 43681 1727204716.42180: variable 'ansible_distribution_major_version' from source: facts 43681 1727204716.42195: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204716.42425: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 43681 1727204716.42815: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 43681 1727204716.42831: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 43681 1727204716.42872: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 43681 1727204716.42924: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 43681 1727204716.43034: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 43681 1727204716.43064: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 43681 1727204716.43097: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204716.43141: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 43681 1727204716.43259: variable '__network_is_ostree' from source: set_fact 43681 1727204716.43267: Evaluated conditional (not __network_is_ostree is defined): False 43681 1727204716.43270: when evaluation is False, skipping this task 43681 1727204716.43273: _execute() done 43681 1727204716.43278: dumping result to json 43681 1727204716.43283: done dumping result, returning 43681 1727204716.43293: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [12b410aa-8751-9e86-7728-0000000004f4] 43681 1727204716.43302: sending task result for task 12b410aa-8751-9e86-7728-0000000004f4 43681 1727204716.43403: done sending task result for task 12b410aa-8751-9e86-7728-0000000004f4 43681 1727204716.43407: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 43681 1727204716.43478: no more pending results, returning what we have 43681 1727204716.43483: results queue empty 43681 1727204716.43485: checking for any_errors_fatal 43681 1727204716.43494: done checking for any_errors_fatal 43681 1727204716.43495: checking for 
max_fail_percentage 43681 1727204716.43498: done checking for max_fail_percentage 43681 1727204716.43499: checking to see if all hosts have failed and the running result is not ok 43681 1727204716.43500: done checking to see if all hosts have failed 43681 1727204716.43501: getting the remaining hosts for this loop 43681 1727204716.43503: done getting the remaining hosts for this loop 43681 1727204716.43508: getting the next task for host managed-node3 43681 1727204716.43515: done getting next task for host managed-node3 43681 1727204716.43519: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 43681 1727204716.43523: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 43681 1727204716.43540: getting variables 43681 1727204716.43542: in VariableManager get_vars() 43681 1727204716.43824: Calling all_inventory to load vars for managed-node3 43681 1727204716.43827: Calling groups_inventory to load vars for managed-node3 43681 1727204716.43831: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204716.43841: Calling all_plugins_play to load vars for managed-node3 43681 1727204716.43845: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204716.43849: Calling groups_plugins_play to load vars for managed-node3 43681 1727204716.46546: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204716.51517: done with get_vars() 43681 1727204716.51561: done getting variables 43681 1727204716.51636: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 15:05:16 -0400 (0:00:00.112) 0:00:24.183 ***** 43681 1727204716.51680: entering _queue_task() for managed-node3/set_fact 43681 1727204716.52455: worker is 1 (out of 1 available) 43681 1727204716.52470: exiting _queue_task() for managed-node3/set_fact 43681 1727204716.52485: done queuing things up, now waiting for results queue to drain 43681 1727204716.52486: waiting for pending results... 
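Both ostree tasks here (the stat check at set_facts.yml:12 and the set_fact at :17) are guarded by "not __network_is_ostree is defined"; because __network_is_ostree was already produced by an earlier set_fact (its source in the log), both are skipped rather than recomputed. A minimal Python sketch of that detect-once-and-cache pattern; the marker path is an assumption used for illustration, not something taken from the role's task file:

import os

_network_is_ostree = None   # stands in for the __network_is_ostree fact

def is_ostree() -> bool:
    """Detect an OSTree-based system once and reuse the cached answer,
    mirroring the 'not __network_is_ostree is defined' guard in the log."""
    global _network_is_ostree
    if _network_is_ostree is None:                      # fact not defined yet
        # /run/ostree-booted is the customary marker file on OSTree systems;
        # assumed here -- the path the role actually checks is not in this log.
        _network_is_ostree = os.path.exists("/run/ostree-booted")
    return _network_is_ostree

print(is_ostree())   # False on a non-OSTree host like managed-node3
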
43681 1727204716.53070: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 43681 1727204716.53311: in run() - task 12b410aa-8751-9e86-7728-0000000004f5 43681 1727204716.53349: variable 'ansible_search_path' from source: unknown 43681 1727204716.53353: variable 'ansible_search_path' from source: unknown 43681 1727204716.53420: calling self._execute() 43681 1727204716.53571: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204716.53575: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204716.53594: variable 'omit' from source: magic vars 43681 1727204716.54363: variable 'ansible_distribution_major_version' from source: facts 43681 1727204716.54472: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204716.54931: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 43681 1727204716.55708: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 43681 1727204716.55830: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 43681 1727204716.55948: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 43681 1727204716.56033: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 43681 1727204716.56155: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 43681 1727204716.56194: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 43681 1727204716.56245: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204716.56283: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 43681 1727204716.56408: variable '__network_is_ostree' from source: set_fact 43681 1727204716.56425: Evaluated conditional (not __network_is_ostree is defined): False 43681 1727204716.56433: when evaluation is False, skipping this task 43681 1727204716.56491: _execute() done 43681 1727204716.56500: dumping result to json 43681 1727204716.56504: done dumping result, returning 43681 1727204716.56507: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12b410aa-8751-9e86-7728-0000000004f5] 43681 1727204716.56509: sending task result for task 12b410aa-8751-9e86-7728-0000000004f5 skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 43681 1727204716.56764: no more pending results, returning what we have 43681 1727204716.56770: results queue empty 43681 1727204716.56771: checking for any_errors_fatal 43681 1727204716.56780: done checking for any_errors_fatal 43681 1727204716.56781: checking for max_fail_percentage 43681 1727204716.56783: done checking for max_fail_percentage 43681 1727204716.56784: checking to see 
if all hosts have failed and the running result is not ok 43681 1727204716.56785: done checking to see if all hosts have failed 43681 1727204716.56786: getting the remaining hosts for this loop 43681 1727204716.56788: done getting the remaining hosts for this loop 43681 1727204716.56794: getting the next task for host managed-node3 43681 1727204716.56804: done getting next task for host managed-node3 43681 1727204716.56808: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 43681 1727204716.56812: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 43681 1727204716.56827: done sending task result for task 12b410aa-8751-9e86-7728-0000000004f5 43681 1727204716.56831: WORKER PROCESS EXITING 43681 1727204716.56897: getting variables 43681 1727204716.56899: in VariableManager get_vars() 43681 1727204716.56936: Calling all_inventory to load vars for managed-node3 43681 1727204716.56939: Calling groups_inventory to load vars for managed-node3 43681 1727204716.56942: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204716.56952: Calling all_plugins_play to load vars for managed-node3 43681 1727204716.56955: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204716.56959: Calling groups_plugins_play to load vars for managed-node3 43681 1727204716.59908: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204716.63437: done with get_vars() 43681 1727204716.63486: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 15:05:16 -0400 (0:00:00.120) 0:00:24.303 ***** 43681 1727204716.63686: entering _queue_task() for managed-node3/service_facts 43681 1727204716.64249: worker is 1 (out of 1 available) 43681 1727204716.64286: exiting _queue_task() for managed-node3/service_facts 43681 1727204716.64348: done queuing things up, now waiting for results queue to drain 43681 1727204716.64351: waiting for pending results... 
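The task queued above runs the service_facts module, whose result lands under ansible_facts.services as a mapping from unit name to state; the role can then branch on which network service (for example NetworkManager) is actually running. A minimal Python sketch of that kind of lookup over a hypothetical two-entry excerpt of the services mapping (the real return value lists every unit on managed-node3):

# Hypothetical excerpt of ansible_facts.services as returned by service_facts;
# the real mapping covers every unit on the host.
services = {
    "NetworkManager.service": {"state": "running", "status": "enabled", "source": "systemd"},
    "firewalld.service":      {"state": "running", "status": "enabled", "source": "systemd"},
}

def service_running(name: str) -> bool:
    # Same shape of check a role conditional expresses in Jinja2, e.g.
    # services['NetworkManager.service'].state == 'running'.
    return services.get(name, {}).get("state") == "running"

print(service_running("NetworkManager.service"))   # True
print(service_running("wpa_supplicant.service"))   # False (not in this excerpt)
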
43681 1727204716.64850: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running 43681 1727204716.65294: in run() - task 12b410aa-8751-9e86-7728-0000000004f7 43681 1727204716.65386: variable 'ansible_search_path' from source: unknown 43681 1727204716.65412: variable 'ansible_search_path' from source: unknown 43681 1727204716.65417: calling self._execute() 43681 1727204716.65712: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204716.65717: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204716.65720: variable 'omit' from source: magic vars 43681 1727204716.66566: variable 'ansible_distribution_major_version' from source: facts 43681 1727204716.66578: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204716.66654: variable 'omit' from source: magic vars 43681 1727204716.66743: variable 'omit' from source: magic vars 43681 1727204716.66842: variable 'omit' from source: magic vars 43681 1727204716.66914: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204716.66989: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204716.67031: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204716.67058: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204716.67098: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204716.67130: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204716.67159: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204716.67229: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204716.67356: Set connection var ansible_shell_type to sh 43681 1727204716.67378: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204716.67406: Set connection var ansible_timeout to 10 43681 1727204716.67426: Set connection var ansible_pipelining to False 43681 1727204716.67447: Set connection var ansible_connection to ssh 43681 1727204716.67595: Set connection var ansible_shell_executable to /bin/sh 43681 1727204716.67599: variable 'ansible_shell_executable' from source: unknown 43681 1727204716.67601: variable 'ansible_connection' from source: unknown 43681 1727204716.67605: variable 'ansible_module_compression' from source: unknown 43681 1727204716.67608: variable 'ansible_shell_type' from source: unknown 43681 1727204716.67610: variable 'ansible_shell_executable' from source: unknown 43681 1727204716.67613: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204716.67618: variable 'ansible_pipelining' from source: unknown 43681 1727204716.67621: variable 'ansible_timeout' from source: unknown 43681 1727204716.67624: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204716.68300: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 43681 1727204716.68305: variable 'omit' from source: magic vars 43681 
1727204716.68395: starting attempt loop 43681 1727204716.68399: running the handler 43681 1727204716.68402: _low_level_execute_command(): starting 43681 1727204716.68405: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 43681 1727204716.69098: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204716.69111: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204716.69123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204716.69190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204716.69248: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204716.69271: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204716.69325: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204716.69370: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204716.71255: stdout chunk (state=3): >>>/root <<< 43681 1727204716.71427: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204716.71431: stdout chunk (state=3): >>><<< 43681 1727204716.71433: stderr chunk (state=3): >>><<< 43681 1727204716.71437: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204716.71440: _low_level_execute_command(): starting 43681 1727204716.71444: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204716.7141807-44625-224591713213135 `" && echo 
ansible-tmp-1727204716.7141807-44625-224591713213135="` echo /root/.ansible/tmp/ansible-tmp-1727204716.7141807-44625-224591713213135 `" ) && sleep 0' 43681 1727204716.72297: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204716.72301: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204716.72304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204716.72308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204716.72346: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 43681 1727204716.72456: stderr chunk (state=3): >>>debug2: match not found <<< 43681 1727204716.72463: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204716.72467: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 43681 1727204716.72469: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address <<< 43681 1727204716.72473: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 43681 1727204716.72476: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204716.72478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204716.72480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204716.72483: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 43681 1727204716.72485: stderr chunk (state=3): >>>debug2: match found <<< 43681 1727204716.72558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204716.72578: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204716.72606: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204716.72662: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204716.72680: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204716.74677: stdout chunk (state=3): >>>ansible-tmp-1727204716.7141807-44625-224591713213135=/root/.ansible/tmp/ansible-tmp-1727204716.7141807-44625-224591713213135 <<< 43681 1727204716.74886: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204716.74891: stdout chunk (state=3): >>><<< 43681 1727204716.74894: stderr chunk (state=3): >>><<< 43681 1727204716.75044: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204716.7141807-44625-224591713213135=/root/.ansible/tmp/ansible-tmp-1727204716.7141807-44625-224591713213135 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204716.75048: variable 'ansible_module_compression' from source: unknown 43681 1727204716.75050: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-436817kzep24q/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 43681 1727204716.75120: variable 'ansible_facts' from source: unknown 43681 1727204716.75231: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204716.7141807-44625-224591713213135/AnsiballZ_service_facts.py 43681 1727204716.75411: Sending initial data 43681 1727204716.75510: Sent initial data (162 bytes) 43681 1727204716.77009: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204716.77041: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204716.77210: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204716.77306: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204716.77531: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204716.79148: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 43681 1727204716.79166: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 43681 1727204716.79203: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 43681 1727204716.79246: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-436817kzep24q/tmpv68vnff8 /root/.ansible/tmp/ansible-tmp-1727204716.7141807-44625-224591713213135/AnsiballZ_service_facts.py <<< 43681 1727204716.79272: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204716.7141807-44625-224591713213135/AnsiballZ_service_facts.py" <<< 43681 1727204716.79313: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-436817kzep24q/tmpv68vnff8" to remote "/root/.ansible/tmp/ansible-tmp-1727204716.7141807-44625-224591713213135/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204716.7141807-44625-224591713213135/AnsiballZ_service_facts.py" <<< 43681 1727204716.81437: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204716.81524: stderr chunk (state=3): >>><<< 43681 1727204716.81528: stdout chunk (state=3): >>><<< 43681 1727204716.81553: done transferring module to remote 43681 1727204716.81568: _low_level_execute_command(): starting 43681 1727204716.81572: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204716.7141807-44625-224591713213135/ /root/.ansible/tmp/ansible-tmp-1727204716.7141807-44625-224591713213135/AnsiballZ_service_facts.py && sleep 0' 43681 1727204716.82065: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204716.82098: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204716.82101: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 43681 1727204716.82136: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 43681 1727204716.82155: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204716.82181: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204716.82207: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204716.82229: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204716.84162: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204716.84219: stderr chunk (state=3): >>><<< 43681 1727204716.84222: stdout chunk (state=3): >>><<< 43681 1727204716.84297: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 
originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204716.84301: _low_level_execute_command(): starting 43681 1727204716.84305: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204716.7141807-44625-224591713213135/AnsiballZ_service_facts.py && sleep 0' 43681 1727204716.84813: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 43681 1727204716.84817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204716.84820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204716.84897: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204716.84933: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204718.81367: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", 
"state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service<<< 43681 1727204718.81403: stdout chunk (state=3): >>>", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state":<<< 43681 1727204718.81435: stdout chunk (state=3): >>> "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": 
"systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "sourc<<< 43681 1727204718.81466: stdout chunk (state=3): >>>e": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", 
"state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "statu<<< 43681 1727204718.81474: stdout chunk (state=3): >>>s": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": 
"sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 43681 1727204718.83232: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 43681 1727204718.83235: stderr chunk (state=3): >>><<< 43681 1727204718.83238: stdout chunk (state=3): >>><<< 43681 1727204718.83264: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": 
"inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": 
"serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", 
"status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, 
"dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
43681 1727204718.84223: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204716.7141807-44625-224591713213135/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 43681 1727204718.84233: _low_level_execute_command(): starting 43681 1727204718.84239: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204716.7141807-44625-224591713213135/ > /dev/null 2>&1 && sleep 0' 43681 1727204718.84737: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204718.84741: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 43681 1727204718.84744: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204718.84746: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204718.84751: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204718.84825: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204718.84864: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204718.87098: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204718.87103: stdout chunk (state=3): >>><<< 43681 1727204718.87105: stderr chunk (state=3): >>><<< 43681 1727204718.87108: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 
10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204718.87111: handler run complete 43681 1727204718.87424: variable 'ansible_facts' from source: unknown 43681 1727204718.87659: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204718.88462: variable 'ansible_facts' from source: unknown 43681 1727204718.88676: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204718.89042: attempt loop complete, returning result 43681 1727204718.89061: _execute() done 43681 1727204718.89071: dumping result to json 43681 1727204718.89158: done dumping result, returning 43681 1727204718.89176: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running [12b410aa-8751-9e86-7728-0000000004f7] 43681 1727204718.89194: sending task result for task 12b410aa-8751-9e86-7728-0000000004f7 43681 1727204718.90711: done sending task result for task 12b410aa-8751-9e86-7728-0000000004f7 43681 1727204718.90714: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 43681 1727204718.90770: no more pending results, returning what we have 43681 1727204718.90775: results queue empty 43681 1727204718.90777: checking for any_errors_fatal 43681 1727204718.90781: done checking for any_errors_fatal 43681 1727204718.90782: checking for max_fail_percentage 43681 1727204718.90784: done checking for max_fail_percentage 43681 1727204718.90785: checking to see if all hosts have failed and the running result is not ok 43681 1727204718.90786: done checking to see if all hosts have failed 43681 1727204718.90787: getting the remaining hosts for this loop 43681 1727204718.90788: done getting the remaining hosts for this loop 43681 1727204718.90795: getting the next task for host managed-node3 43681 1727204718.90801: done getting next task for host managed-node3 43681 1727204718.90806: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 43681 1727204718.90808: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204718.90817: getting variables 43681 1727204718.90819: in VariableManager get_vars() 43681 1727204718.90855: Calling all_inventory to load vars for managed-node3 43681 1727204718.90859: Calling groups_inventory to load vars for managed-node3 43681 1727204718.90862: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204718.90874: Calling all_plugins_play to load vars for managed-node3 43681 1727204718.90880: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204718.90885: Calling groups_plugins_play to load vars for managed-node3 43681 1727204718.92383: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204718.94339: done with get_vars() 43681 1727204718.94366: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 15:05:18 -0400 (0:00:02.307) 0:00:26.611 ***** 43681 1727204718.94452: entering _queue_task() for managed-node3/package_facts 43681 1727204718.94723: worker is 1 (out of 1 available) 43681 1727204718.94739: exiting _queue_task() for managed-node3/package_facts 43681 1727204718.94754: done queuing things up, now waiting for results queue to drain 43681 1727204718.94756: waiting for pending results... 43681 1727204718.94956: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 43681 1727204718.95077: in run() - task 12b410aa-8751-9e86-7728-0000000004f8 43681 1727204718.95093: variable 'ansible_search_path' from source: unknown 43681 1727204718.95097: variable 'ansible_search_path' from source: unknown 43681 1727204718.95138: calling self._execute() 43681 1727204718.95237: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204718.95244: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204718.95253: variable 'omit' from source: magic vars 43681 1727204718.95588: variable 'ansible_distribution_major_version' from source: facts 43681 1727204718.95600: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204718.95606: variable 'omit' from source: magic vars 43681 1727204718.95659: variable 'omit' from source: magic vars 43681 1727204718.95690: variable 'omit' from source: magic vars 43681 1727204718.95730: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204718.95768: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204718.95783: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204718.95802: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204718.95812: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204718.95844: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204718.95847: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204718.95851: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204718.95941: Set connection var ansible_shell_type to sh 43681 
1727204718.95948: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204718.95956: Set connection var ansible_timeout to 10 43681 1727204718.95965: Set connection var ansible_pipelining to False 43681 1727204718.95971: Set connection var ansible_connection to ssh 43681 1727204718.95985: Set connection var ansible_shell_executable to /bin/sh 43681 1727204718.96005: variable 'ansible_shell_executable' from source: unknown 43681 1727204718.96009: variable 'ansible_connection' from source: unknown 43681 1727204718.96012: variable 'ansible_module_compression' from source: unknown 43681 1727204718.96015: variable 'ansible_shell_type' from source: unknown 43681 1727204718.96022: variable 'ansible_shell_executable' from source: unknown 43681 1727204718.96024: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204718.96030: variable 'ansible_pipelining' from source: unknown 43681 1727204718.96033: variable 'ansible_timeout' from source: unknown 43681 1727204718.96039: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204718.96268: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 43681 1727204718.96274: variable 'omit' from source: magic vars 43681 1727204718.96296: starting attempt loop 43681 1727204718.96301: running the handler 43681 1727204718.96304: _low_level_execute_command(): starting 43681 1727204718.96331: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 43681 1727204718.97086: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204718.97092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204718.97095: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204718.97097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204718.97160: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204718.97167: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204718.97169: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204718.97213: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204718.98964: stdout chunk (state=3): >>>/root <<< 43681 1727204718.99071: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204718.99129: stderr chunk (state=3): >>><<< 43681 1727204718.99132: stdout chunk (state=3): >>><<< 43681 1727204718.99154: _low_level_execute_command() 
done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204718.99167: _low_level_execute_command(): starting 43681 1727204718.99173: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204718.9915433-44695-20005393523746 `" && echo ansible-tmp-1727204718.9915433-44695-20005393523746="` echo /root/.ansible/tmp/ansible-tmp-1727204718.9915433-44695-20005393523746 `" ) && sleep 0' 43681 1727204718.99669: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204718.99673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204718.99676: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204718.99686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204718.99721: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204718.99740: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204718.99788: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204719.01811: stdout chunk (state=3): >>>ansible-tmp-1727204718.9915433-44695-20005393523746=/root/.ansible/tmp/ansible-tmp-1727204718.9915433-44695-20005393523746 <<< 43681 1727204719.01935: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204719.01986: stderr chunk (state=3): >>><<< 43681 1727204719.01992: stdout chunk (state=3): >>><<< 43681 1727204719.02007: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204718.9915433-44695-20005393523746=/root/.ansible/tmp/ansible-tmp-1727204718.9915433-44695-20005393523746 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204719.02091: variable 'ansible_module_compression' from source: unknown 43681 1727204719.02132: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-436817kzep24q/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 43681 1727204719.02206: variable 'ansible_facts' from source: unknown 43681 1727204719.02382: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204718.9915433-44695-20005393523746/AnsiballZ_package_facts.py 43681 1727204719.02546: Sending initial data 43681 1727204719.02550: Sent initial data (161 bytes) 43681 1727204719.03172: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 43681 1727204719.03176: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204719.03247: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204719.03292: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204719.04962: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" 
revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 43681 1727204719.04990: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 43681 1727204719.05033: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-436817kzep24q/tmp8evwtlku /root/.ansible/tmp/ansible-tmp-1727204718.9915433-44695-20005393523746/AnsiballZ_package_facts.py <<< 43681 1727204719.05069: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204718.9915433-44695-20005393523746/AnsiballZ_package_facts.py" <<< 43681 1727204719.05099: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-436817kzep24q/tmp8evwtlku" to remote "/root/.ansible/tmp/ansible-tmp-1727204718.9915433-44695-20005393523746/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204718.9915433-44695-20005393523746/AnsiballZ_package_facts.py" <<< 43681 1727204719.06956: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204719.07035: stderr chunk (state=3): >>><<< 43681 1727204719.07039: stdout chunk (state=3): >>><<< 43681 1727204719.07060: done transferring module to remote 43681 1727204719.07074: _low_level_execute_command(): starting 43681 1727204719.07077: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204718.9915433-44695-20005393523746/ /root/.ansible/tmp/ansible-tmp-1727204718.9915433-44695-20005393523746/AnsiballZ_package_facts.py && sleep 0' 43681 1727204719.08021: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204719.08149: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204719.08153: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204719.08197: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204719.08232: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204719.10309: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204719.10313: stdout chunk (state=3): >>><<< 43681 1727204719.10322: stderr chunk (state=3): >>><<< 43681 1727204719.10331: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204719.10336: _low_level_execute_command(): starting 43681 1727204719.10343: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204718.9915433-44695-20005393523746/AnsiballZ_package_facts.py && sleep 0' 43681 1727204719.11119: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204719.11127: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204719.11159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204719.11180: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204719.11199: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204719.11271: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204719.11278: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204719.11285: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204719.11345: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204719.76120: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": 
[{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": 
"intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": <<< 43681 1727204719.76277: stdout chunk (state=3): >>>"rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": 
"5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": 
"9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, <<< 43681 1727204719.76300: stdout chunk (state=3): >>>"arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", 
"version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": 
"libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": 
"1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": 
"0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": 
[{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", 
"version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", 
"release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", <<< 43681 1727204719.76360: stdout chunk (state=3): >>>"source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": 
[{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": 
[{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 43681 1727204719.78318: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204719.78342: stderr chunk (state=3): >>>Shared connection to 10.31.10.90 closed. 
<<< 43681 1727204719.78469: stderr chunk (state=3): >>><<< 43681 1727204719.78472: stdout chunk (state=3): >>><<< 43681 1727204719.78702: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", 
"version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", 
"release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": 
"firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", 
"version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": 
"ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": 
"fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", 
"version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": 
"1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", 
"release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", 
"version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": 
"libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": 
"net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 43681 1727204719.82967: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204718.9915433-44695-20005393523746/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 43681 1727204719.83019: _low_level_execute_command(): starting 43681 1727204719.83034: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204718.9915433-44695-20005393523746/ > /dev/null 2>&1 && sleep 0' 43681 1727204719.83862: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204719.83881: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204719.83901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204719.83935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204719.83987: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204719.84006: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 43681 1727204719.84085: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204719.84129: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204719.84163: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204719.84238: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204719.86325: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204719.86329: stdout chunk (state=3): >>><<< 43681 1727204719.86332: stderr chunk (state=3): >>><<< 43681 1727204719.86353: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204719.86495: handler run complete 43681 1727204719.87613: variable 'ansible_facts' from source: unknown 43681 1727204719.88078: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204719.90858: variable 'ansible_facts' from source: unknown 43681 1727204719.91650: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204719.93262: attempt loop complete, returning result 43681 1727204719.93390: _execute() done 43681 1727204719.93404: dumping result to json 43681 1727204719.93696: done dumping result, returning 43681 1727204719.93727: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [12b410aa-8751-9e86-7728-0000000004f8] 43681 1727204719.93745: sending task result for task 12b410aa-8751-9e86-7728-0000000004f8 43681 1727204719.96530: done sending task result for task 12b410aa-8751-9e86-7728-0000000004f8 43681 1727204719.96534: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 43681 1727204719.96630: no more pending results, returning what we have 43681 1727204719.96633: results queue empty 43681 1727204719.96634: checking for any_errors_fatal 43681 1727204719.96637: done checking for any_errors_fatal 43681 1727204719.96637: checking for max_fail_percentage 43681 1727204719.96639: done checking for max_fail_percentage 43681 1727204719.96639: checking to see if all hosts have failed and the running result is not ok 43681 1727204719.96640: done checking to see if all hosts have failed 43681 1727204719.96641: getting the remaining hosts for this loop 43681 1727204719.96641: done getting the remaining hosts for this loop 43681 1727204719.96646: getting the next task for host managed-node3 43681 1727204719.96652: done getting next task for host managed-node3 43681 1727204719.96655: ^ 
task is: TASK: fedora.linux_system_roles.network : Print network provider 43681 1727204719.96657: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 43681 1727204719.96665: getting variables 43681 1727204719.96666: in VariableManager get_vars() 43681 1727204719.96705: Calling all_inventory to load vars for managed-node3 43681 1727204719.96709: Calling groups_inventory to load vars for managed-node3 43681 1727204719.96711: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204719.96721: Calling all_plugins_play to load vars for managed-node3 43681 1727204719.96725: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204719.96728: Calling groups_plugins_play to load vars for managed-node3 43681 1727204719.98632: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204720.00446: done with get_vars() 43681 1727204720.00472: done getting variables 43681 1727204720.00528: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 15:05:20 -0400 (0:00:01.060) 0:00:27.672 ***** 43681 1727204720.00554: entering _queue_task() for managed-node3/debug 43681 1727204720.00831: worker is 1 (out of 1 available) 43681 1727204720.00848: exiting _queue_task() for managed-node3/debug 43681 1727204720.00862: done queuing things up, now waiting for results queue to drain 43681 1727204720.00864: waiting for pending results... 
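The log only shows the header of the task defined at roles/network/tasks/main.yml:7; the task file itself is not part of this transcript. Based on the 'debug' action plugin loaded above and the "Using network provider: nm" message rendered in the result further down, a minimal sketch of an equivalent task could look like the following (the variable name network_provider is taken from the set_fact reference in the log; the exact wording in the role may differ):

    - name: Print network provider
      ansible.builtin.debug:
        msg: "Using network provider: {{ network_provider }}"
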
43681 1727204720.01062: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider 43681 1727204720.01147: in run() - task 12b410aa-8751-9e86-7728-000000000072 43681 1727204720.01161: variable 'ansible_search_path' from source: unknown 43681 1727204720.01165: variable 'ansible_search_path' from source: unknown 43681 1727204720.01202: calling self._execute() 43681 1727204720.01292: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204720.01298: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204720.01309: variable 'omit' from source: magic vars 43681 1727204720.01643: variable 'ansible_distribution_major_version' from source: facts 43681 1727204720.01696: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204720.01700: variable 'omit' from source: magic vars 43681 1727204720.01703: variable 'omit' from source: magic vars 43681 1727204720.01784: variable 'network_provider' from source: set_fact 43681 1727204720.01801: variable 'omit' from source: magic vars 43681 1727204720.01840: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204720.01876: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204720.01897: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204720.01913: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204720.01927: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204720.01959: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204720.01962: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204720.01965: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204720.02052: Set connection var ansible_shell_type to sh 43681 1727204720.02058: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204720.02067: Set connection var ansible_timeout to 10 43681 1727204720.02075: Set connection var ansible_pipelining to False 43681 1727204720.02088: Set connection var ansible_connection to ssh 43681 1727204720.02093: Set connection var ansible_shell_executable to /bin/sh 43681 1727204720.02111: variable 'ansible_shell_executable' from source: unknown 43681 1727204720.02114: variable 'ansible_connection' from source: unknown 43681 1727204720.02117: variable 'ansible_module_compression' from source: unknown 43681 1727204720.02123: variable 'ansible_shell_type' from source: unknown 43681 1727204720.02126: variable 'ansible_shell_executable' from source: unknown 43681 1727204720.02130: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204720.02135: variable 'ansible_pipelining' from source: unknown 43681 1727204720.02139: variable 'ansible_timeout' from source: unknown 43681 1727204720.02144: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204720.02266: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 43681 1727204720.02277: variable 'omit' from source: magic vars 43681 1727204720.02284: starting attempt loop 43681 1727204720.02288: running the handler 43681 1727204720.02336: handler run complete 43681 1727204720.02352: attempt loop complete, returning result 43681 1727204720.02355: _execute() done 43681 1727204720.02358: dumping result to json 43681 1727204720.02363: done dumping result, returning 43681 1727204720.02371: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider [12b410aa-8751-9e86-7728-000000000072] 43681 1727204720.02377: sending task result for task 12b410aa-8751-9e86-7728-000000000072 43681 1727204720.02468: done sending task result for task 12b410aa-8751-9e86-7728-000000000072 43681 1727204720.02471: WORKER PROCESS EXITING ok: [managed-node3] => {} MSG: Using network provider: nm 43681 1727204720.02566: no more pending results, returning what we have 43681 1727204720.02571: results queue empty 43681 1727204720.02572: checking for any_errors_fatal 43681 1727204720.02584: done checking for any_errors_fatal 43681 1727204720.02585: checking for max_fail_percentage 43681 1727204720.02587: done checking for max_fail_percentage 43681 1727204720.02588: checking to see if all hosts have failed and the running result is not ok 43681 1727204720.02589: done checking to see if all hosts have failed 43681 1727204720.02589: getting the remaining hosts for this loop 43681 1727204720.02598: done getting the remaining hosts for this loop 43681 1727204720.02604: getting the next task for host managed-node3 43681 1727204720.02610: done getting next task for host managed-node3 43681 1727204720.02614: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 43681 1727204720.02617: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204720.02627: getting variables 43681 1727204720.02629: in VariableManager get_vars() 43681 1727204720.02665: Calling all_inventory to load vars for managed-node3 43681 1727204720.02668: Calling groups_inventory to load vars for managed-node3 43681 1727204720.02670: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204720.02679: Calling all_plugins_play to load vars for managed-node3 43681 1727204720.02682: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204720.02686: Calling groups_plugins_play to load vars for managed-node3 43681 1727204720.03949: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204720.05599: done with get_vars() 43681 1727204720.05634: done getting variables 43681 1727204720.05691: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:05:20 -0400 (0:00:00.051) 0:00:27.723 ***** 43681 1727204720.05719: entering _queue_task() for managed-node3/fail 43681 1727204720.06002: worker is 1 (out of 1 available) 43681 1727204720.06019: exiting _queue_task() for managed-node3/fail 43681 1727204720.06033: done queuing things up, now waiting for results queue to drain 43681 1727204720.06035: waiting for pending results... 
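This guard task (main.yml:11) uses the 'fail' action plugin loaded above and is skipped further down because its first condition, network_state != {}, evaluates to False. The sketch below is a hedged reconstruction, not the role's actual source: the failure message is placeholder text, and the second condition is an assumption, since a 'when' list stops evaluating at the first False entry and the log therefore never shows it:

    - name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
      ansible.builtin.fail:
        msg: Cannot apply network_state with the initscripts provider  # placeholder wording, not taken from the role
      when:
        - network_state != {}
        - network_provider == "initscripts"  # assumed follow-up condition; not visible in this log
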
43681 1727204720.06250: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 43681 1727204720.06333: in run() - task 12b410aa-8751-9e86-7728-000000000073 43681 1727204720.06346: variable 'ansible_search_path' from source: unknown 43681 1727204720.06350: variable 'ansible_search_path' from source: unknown 43681 1727204720.06386: calling self._execute() 43681 1727204720.06476: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204720.06480: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204720.06498: variable 'omit' from source: magic vars 43681 1727204720.06823: variable 'ansible_distribution_major_version' from source: facts 43681 1727204720.06833: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204720.06941: variable 'network_state' from source: role '' defaults 43681 1727204720.06949: Evaluated conditional (network_state != {}): False 43681 1727204720.06953: when evaluation is False, skipping this task 43681 1727204720.06956: _execute() done 43681 1727204720.06960: dumping result to json 43681 1727204720.06965: done dumping result, returning 43681 1727204720.06973: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12b410aa-8751-9e86-7728-000000000073] 43681 1727204720.06979: sending task result for task 12b410aa-8751-9e86-7728-000000000073 43681 1727204720.07076: done sending task result for task 12b410aa-8751-9e86-7728-000000000073 43681 1727204720.07079: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 43681 1727204720.07134: no more pending results, returning what we have 43681 1727204720.07139: results queue empty 43681 1727204720.07140: checking for any_errors_fatal 43681 1727204720.07150: done checking for any_errors_fatal 43681 1727204720.07151: checking for max_fail_percentage 43681 1727204720.07153: done checking for max_fail_percentage 43681 1727204720.07154: checking to see if all hosts have failed and the running result is not ok 43681 1727204720.07155: done checking to see if all hosts have failed 43681 1727204720.07156: getting the remaining hosts for this loop 43681 1727204720.07157: done getting the remaining hosts for this loop 43681 1727204720.07162: getting the next task for host managed-node3 43681 1727204720.07168: done getting next task for host managed-node3 43681 1727204720.07172: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 43681 1727204720.07175: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204720.07193: getting variables 43681 1727204720.07195: in VariableManager get_vars() 43681 1727204720.07233: Calling all_inventory to load vars for managed-node3 43681 1727204720.07237: Calling groups_inventory to load vars for managed-node3 43681 1727204720.07240: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204720.07249: Calling all_plugins_play to load vars for managed-node3 43681 1727204720.07252: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204720.07256: Calling groups_plugins_play to load vars for managed-node3 43681 1727204720.12309: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204720.14311: done with get_vars() 43681 1727204720.14369: done getting variables 43681 1727204720.14440: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:05:20 -0400 (0:00:00.087) 0:00:27.811 ***** 43681 1727204720.14474: entering _queue_task() for managed-node3/fail 43681 1727204720.14893: worker is 1 (out of 1 available) 43681 1727204720.14908: exiting _queue_task() for managed-node3/fail 43681 1727204720.14925: done queuing things up, now waiting for results queue to drain 43681 1727204720.14928: waiting for pending results... 
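The next guard (main.yml:18) is skipped for the same reason: network_state is empty. When reading verbose logs like this one, it can help to dump the handful of facts and variables these conditionals consume so the false_condition lines are easier to follow; the task below is a hypothetical helper for that purpose and is not part of the role:

    - name: Show the inputs the network role guards evaluate  # hypothetical helper, not in the role
      ansible.builtin.debug:
        msg: >-
          distribution={{ ansible_distribution }}
          major_version={{ ansible_distribution_major_version }}
          network_state={{ network_state | default({}) }}
          network_provider={{ network_provider | default('unset') }}
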
43681 1727204720.15335: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 43681 1727204720.15427: in run() - task 12b410aa-8751-9e86-7728-000000000074 43681 1727204720.15431: variable 'ansible_search_path' from source: unknown 43681 1727204720.15495: variable 'ansible_search_path' from source: unknown 43681 1727204720.15501: calling self._execute() 43681 1727204720.15624: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204720.15648: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204720.15671: variable 'omit' from source: magic vars 43681 1727204720.16172: variable 'ansible_distribution_major_version' from source: facts 43681 1727204720.16201: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204720.16369: variable 'network_state' from source: role '' defaults 43681 1727204720.16407: Evaluated conditional (network_state != {}): False 43681 1727204720.16411: when evaluation is False, skipping this task 43681 1727204720.16414: _execute() done 43681 1727204720.16495: dumping result to json 43681 1727204720.16498: done dumping result, returning 43681 1727204720.16502: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12b410aa-8751-9e86-7728-000000000074] 43681 1727204720.16505: sending task result for task 12b410aa-8751-9e86-7728-000000000074 43681 1727204720.16592: done sending task result for task 12b410aa-8751-9e86-7728-000000000074 43681 1727204720.16596: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 43681 1727204720.16661: no more pending results, returning what we have 43681 1727204720.16667: results queue empty 43681 1727204720.16668: checking for any_errors_fatal 43681 1727204720.16681: done checking for any_errors_fatal 43681 1727204720.16682: checking for max_fail_percentage 43681 1727204720.16685: done checking for max_fail_percentage 43681 1727204720.16686: checking to see if all hosts have failed and the running result is not ok 43681 1727204720.16687: done checking to see if all hosts have failed 43681 1727204720.16688: getting the remaining hosts for this loop 43681 1727204720.16692: done getting the remaining hosts for this loop 43681 1727204720.16698: getting the next task for host managed-node3 43681 1727204720.16706: done getting next task for host managed-node3 43681 1727204720.16711: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 43681 1727204720.16714: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204720.16735: getting variables 43681 1727204720.16738: in VariableManager get_vars() 43681 1727204720.16786: Calling all_inventory to load vars for managed-node3 43681 1727204720.16993: Calling groups_inventory to load vars for managed-node3 43681 1727204720.16997: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204720.17012: Calling all_plugins_play to load vars for managed-node3 43681 1727204720.17019: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204720.17023: Calling groups_plugins_play to load vars for managed-node3 43681 1727204720.18884: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204720.20683: done with get_vars() 43681 1727204720.20723: done getting variables 43681 1727204720.20792: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:05:20 -0400 (0:00:00.063) 0:00:27.874 ***** 43681 1727204720.20825: entering _queue_task() for managed-node3/fail 43681 1727204720.21177: worker is 1 (out of 1 available) 43681 1727204720.21194: exiting _queue_task() for managed-node3/fail 43681 1727204720.21206: done queuing things up, now waiting for results queue to drain 43681 1727204720.21208: waiting for pending results... 
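Both conditions the engine evaluates for this guard (main.yml:25) appear below: ansible_distribution_major_version | int > 9 comes back True on Fedora 39, while ansible_distribution in __network_rh_distros comes back False, so the fail never fires. A sketch assembled only from those two visible expressions (the message is placeholder text, and the real task may carry further conditions about team connections that this log never reaches):

    - name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
      ansible.builtin.fail:
        msg: Teaming is not supported on this platform  # placeholder wording
      when:
        - ansible_distribution_major_version | int > 9
        - ansible_distribution in __network_rh_distros  # __network_rh_distros is a role default referenced in the log
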
43681 1727204720.21487: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 43681 1727204720.21578: in run() - task 12b410aa-8751-9e86-7728-000000000075 43681 1727204720.21591: variable 'ansible_search_path' from source: unknown 43681 1727204720.21596: variable 'ansible_search_path' from source: unknown 43681 1727204720.21630: calling self._execute() 43681 1727204720.21726: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204720.21732: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204720.21742: variable 'omit' from source: magic vars 43681 1727204720.22069: variable 'ansible_distribution_major_version' from source: facts 43681 1727204720.22079: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204720.22242: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 43681 1727204720.24308: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 43681 1727204720.24370: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 43681 1727204720.24406: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 43681 1727204720.24439: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 43681 1727204720.24462: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 43681 1727204720.24536: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204720.24560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204720.24581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204720.24622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204720.24635: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204720.24724: variable 'ansible_distribution_major_version' from source: facts 43681 1727204720.24736: Evaluated conditional (ansible_distribution_major_version | int > 9): True 43681 1727204720.24841: variable 'ansible_distribution' from source: facts 43681 1727204720.24845: variable '__network_rh_distros' from source: role '' defaults 43681 1727204720.24855: Evaluated conditional (ansible_distribution in __network_rh_distros): False 43681 1727204720.24858: when evaluation is False, skipping this task 43681 1727204720.24861: _execute() done 43681 1727204720.24865: dumping result to json 43681 1727204720.24869: done dumping result, returning 43681 1727204720.24877: done running TaskExecutor() 
for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12b410aa-8751-9e86-7728-000000000075] 43681 1727204720.24884: sending task result for task 12b410aa-8751-9e86-7728-000000000075 43681 1727204720.24981: done sending task result for task 12b410aa-8751-9e86-7728-000000000075 43681 1727204720.24984: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 43681 1727204720.25038: no more pending results, returning what we have 43681 1727204720.25043: results queue empty 43681 1727204720.25044: checking for any_errors_fatal 43681 1727204720.25049: done checking for any_errors_fatal 43681 1727204720.25050: checking for max_fail_percentage 43681 1727204720.25052: done checking for max_fail_percentage 43681 1727204720.25054: checking to see if all hosts have failed and the running result is not ok 43681 1727204720.25055: done checking to see if all hosts have failed 43681 1727204720.25055: getting the remaining hosts for this loop 43681 1727204720.25057: done getting the remaining hosts for this loop 43681 1727204720.25062: getting the next task for host managed-node3 43681 1727204720.25068: done getting next task for host managed-node3 43681 1727204720.25072: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 43681 1727204720.25075: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204720.25101: getting variables 43681 1727204720.25103: in VariableManager get_vars() 43681 1727204720.25150: Calling all_inventory to load vars for managed-node3 43681 1727204720.25152: Calling groups_inventory to load vars for managed-node3 43681 1727204720.25155: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204720.25165: Calling all_plugins_play to load vars for managed-node3 43681 1727204720.25169: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204720.25172: Calling groups_plugins_play to load vars for managed-node3 43681 1727204720.26638: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204720.29022: done with get_vars() 43681 1727204720.29057: done getting variables 43681 1727204720.29128: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 15:05:20 -0400 (0:00:00.083) 0:00:27.958 ***** 43681 1727204720.29162: entering _queue_task() for managed-node3/dnf 43681 1727204720.29505: worker is 1 (out of 1 available) 43681 1727204720.29523: exiting _queue_task() for managed-node3/dnf 43681 1727204720.29538: done queuing things up, now waiting for results queue to drain 43681 1727204720.29540: waiting for pending results... 
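For orientation: the "Abort applying teaming configuration if the system version of the managed host is EL10 or later" task traced above is skipped because (ansible_distribution in __network_rh_distros) evaluates False on managed-node3, even though (ansible_distribution_major_version | int > 9) is True. A minimal sketch of how such a guard could look in the role's tasks file; only the two conditions are taken verbatim from the trace, while the module choice and message text are illustrative assumptions:

    - name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
      ansible.builtin.fail:                               # assumption: a fail-style guard
        msg: Teaming is not supported on EL10 or later.   # message text is an assumption
      when:
        - ansible_distribution_major_version | int > 9    # True in the trace
        - ansible_distribution in __network_rh_distros    # False here, so the task is skipped

Because entries under when are AND-ed, a single False condition is enough to skip the task, which is exactly what the "false_condition" field in the skip result records.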
43681 1727204720.29746: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 43681 1727204720.29847: in run() - task 12b410aa-8751-9e86-7728-000000000076 43681 1727204720.29859: variable 'ansible_search_path' from source: unknown 43681 1727204720.29862: variable 'ansible_search_path' from source: unknown 43681 1727204720.29903: calling self._execute() 43681 1727204720.29989: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204720.30000: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204720.30010: variable 'omit' from source: magic vars 43681 1727204720.30341: variable 'ansible_distribution_major_version' from source: facts 43681 1727204720.30352: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204720.30532: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 43681 1727204720.32339: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 43681 1727204720.32404: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 43681 1727204720.32440: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 43681 1727204720.32471: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 43681 1727204720.32495: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 43681 1727204720.32569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204720.32594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204720.32616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204720.32654: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204720.32666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204720.32771: variable 'ansible_distribution' from source: facts 43681 1727204720.32775: variable 'ansible_distribution_major_version' from source: facts 43681 1727204720.32783: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 43681 1727204720.32886: variable '__network_wireless_connections_defined' from source: role '' defaults 43681 1727204720.33003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204720.33026: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204720.33051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204720.33084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204720.33101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204720.33140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204720.33165: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204720.33187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204720.33220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204720.33234: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204720.33267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204720.33296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204720.33319: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204720.33349: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204720.33361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204720.33500: variable 'network_connections' from source: play vars 43681 1727204720.33509: variable 'profile' from source: play vars 43681 1727204720.33567: variable 'profile' from source: play vars 43681 1727204720.33571: variable 'interface' from source: set_fact 43681 1727204720.33628: variable 'interface' from source: set_fact 43681 1727204720.33694: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' 
skipped due to reserved name 43681 1727204720.33960: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 43681 1727204720.33993: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 43681 1727204720.34022: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 43681 1727204720.34051: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 43681 1727204720.34087: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 43681 1727204720.34108: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 43681 1727204720.34134: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204720.34160: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 43681 1727204720.34205: variable '__network_team_connections_defined' from source: role '' defaults 43681 1727204720.34408: variable 'network_connections' from source: play vars 43681 1727204720.34413: variable 'profile' from source: play vars 43681 1727204720.34466: variable 'profile' from source: play vars 43681 1727204720.34470: variable 'interface' from source: set_fact 43681 1727204720.34589: variable 'interface' from source: set_fact 43681 1727204720.34596: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 43681 1727204720.34598: when evaluation is False, skipping this task 43681 1727204720.34600: _execute() done 43681 1727204720.34602: dumping result to json 43681 1727204720.34604: done dumping result, returning 43681 1727204720.34606: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12b410aa-8751-9e86-7728-000000000076] 43681 1727204720.34608: sending task result for task 12b410aa-8751-9e86-7728-000000000076 43681 1727204720.34675: done sending task result for task 12b410aa-8751-9e86-7728-000000000076 43681 1727204720.34678: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 43681 1727204720.34751: no more pending results, returning what we have 43681 1727204720.34755: results queue empty 43681 1727204720.34756: checking for any_errors_fatal 43681 1727204720.34762: done checking for any_errors_fatal 43681 1727204720.34763: checking for max_fail_percentage 43681 1727204720.34765: done checking for max_fail_percentage 43681 1727204720.34766: checking to see if all hosts have failed and the running result is not ok 43681 1727204720.34767: done checking to see if all hosts have failed 43681 1727204720.34768: getting the remaining hosts for this loop 43681 1727204720.34770: done getting the remaining hosts for this loop 43681 
1727204720.34774: getting the next task for host managed-node3 43681 1727204720.34780: done getting next task for host managed-node3 43681 1727204720.34784: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 43681 1727204720.34788: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 43681 1727204720.34805: getting variables 43681 1727204720.34807: in VariableManager get_vars() 43681 1727204720.34848: Calling all_inventory to load vars for managed-node3 43681 1727204720.34851: Calling groups_inventory to load vars for managed-node3 43681 1727204720.34854: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204720.34863: Calling all_plugins_play to load vars for managed-node3 43681 1727204720.34866: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204720.34870: Calling groups_plugins_play to load vars for managed-node3 43681 1727204720.36250: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204720.37904: done with get_vars() 43681 1727204720.37931: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 43681 1727204720.38001: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:05:20 -0400 (0:00:00.088) 0:00:28.046 ***** 43681 1727204720.38028: entering _queue_task() for managed-node3/yum 43681 1727204720.38302: worker is 1 (out of 1 available) 43681 1727204720.38321: exiting _queue_task() for managed-node3/yum 43681 1727204720.38334: done queuing things up, now waiting for results queue to drain 43681 1727204720.38336: waiting for pending results... 
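The DNF availability check traced above (task path roles/network/tasks/main.yml:36) is skipped because (__network_wireless_connections_defined or __network_team_connections_defined) evaluates False, which indicates the connection profile under test defines neither a wireless nor a team interface. A hedged sketch of the shape of such a task; the dnf action and both conditions come from the trace, while the package list, state and check_mode arguments are assumptions added for illustration:

    - name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
      ansible.builtin.dnf:
        name: "{{ network_packages }}"   # assumed package list
        state: latest
      check_mode: true                   # assumption: only checks availability, does not install
      when:
        - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7   # True in the trace
        - __network_wireless_connections_defined or __network_team_connections_defined       # False here

The same wireless-or-team condition gates the later "Ask user's consent to restart NetworkManager" task, which is skipped for the identical reason.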
43681 1727204720.38534: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 43681 1727204720.38620: in run() - task 12b410aa-8751-9e86-7728-000000000077 43681 1727204720.38630: variable 'ansible_search_path' from source: unknown 43681 1727204720.38634: variable 'ansible_search_path' from source: unknown 43681 1727204720.38668: calling self._execute() 43681 1727204720.38777: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204720.38785: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204720.38800: variable 'omit' from source: magic vars 43681 1727204720.39139: variable 'ansible_distribution_major_version' from source: facts 43681 1727204720.39151: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204720.39307: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 43681 1727204720.41134: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 43681 1727204720.41200: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 43681 1727204720.41237: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 43681 1727204720.41268: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 43681 1727204720.41294: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 43681 1727204720.41368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204720.41394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204720.41422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204720.41453: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204720.41466: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204720.41553: variable 'ansible_distribution_major_version' from source: facts 43681 1727204720.41568: Evaluated conditional (ansible_distribution_major_version | int < 8): False 43681 1727204720.41572: when evaluation is False, skipping this task 43681 1727204720.41575: _execute() done 43681 1727204720.41579: dumping result to json 43681 1727204720.41584: done dumping result, returning 43681 1727204720.41594: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12b410aa-8751-9e86-7728-000000000077] 43681 
1727204720.41610: sending task result for task 12b410aa-8751-9e86-7728-000000000077 43681 1727204720.41703: done sending task result for task 12b410aa-8751-9e86-7728-000000000077 43681 1727204720.41706: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 43681 1727204720.41791: no more pending results, returning what we have 43681 1727204720.41796: results queue empty 43681 1727204720.41797: checking for any_errors_fatal 43681 1727204720.41802: done checking for any_errors_fatal 43681 1727204720.41803: checking for max_fail_percentage 43681 1727204720.41805: done checking for max_fail_percentage 43681 1727204720.41807: checking to see if all hosts have failed and the running result is not ok 43681 1727204720.41808: done checking to see if all hosts have failed 43681 1727204720.41808: getting the remaining hosts for this loop 43681 1727204720.41810: done getting the remaining hosts for this loop 43681 1727204720.41814: getting the next task for host managed-node3 43681 1727204720.41823: done getting next task for host managed-node3 43681 1727204720.41827: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 43681 1727204720.41829: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 43681 1727204720.41845: getting variables 43681 1727204720.41846: in VariableManager get_vars() 43681 1727204720.41886: Calling all_inventory to load vars for managed-node3 43681 1727204720.41896: Calling groups_inventory to load vars for managed-node3 43681 1727204720.41899: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204720.41909: Calling all_plugins_play to load vars for managed-node3 43681 1727204720.41913: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204720.41918: Calling groups_plugins_play to load vars for managed-node3 43681 1727204720.43208: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204720.45028: done with get_vars() 43681 1727204720.45074: done getting variables 43681 1727204720.45166: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:05:20 -0400 (0:00:00.071) 0:00:28.118 ***** 43681 1727204720.45207: entering _queue_task() for managed-node3/fail 43681 1727204720.45682: worker is 1 (out of 1 available) 43681 1727204720.45809: exiting _queue_task() for managed-node3/fail 43681 1727204720.45841: done queuing things up, now waiting for results queue to drain 43681 1727204720.45844: waiting for pending results... 
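Two details in the block above are worth noting. First, even though the task targets the YUM package manager, ansible-core redirects the action: the "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" line shows that the yum action is served by the dnf implementation here. Second, the task is skipped before any module runs because (ansible_distribution_major_version | int < 8) is False on a host whose major version is greater than 9. A sketch under those observations; only the condition and the yum action are from the trace, the arguments are assumptions:

    - name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
      ansible.builtin.yum:                # redirected to ansible.builtin.dnf by ansible-core
        name: "{{ network_packages }}"    # assumed package list
        state: latest
      check_mode: true                    # assumption
      when:
        - ansible_distribution_major_version | int < 8   # False here, so the task is skipped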
43681 1727204720.46205: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 43681 1727204720.46384: in run() - task 12b410aa-8751-9e86-7728-000000000078 43681 1727204720.46391: variable 'ansible_search_path' from source: unknown 43681 1727204720.46395: variable 'ansible_search_path' from source: unknown 43681 1727204720.46431: calling self._execute() 43681 1727204720.46519: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204720.46529: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204720.46539: variable 'omit' from source: magic vars 43681 1727204720.46875: variable 'ansible_distribution_major_version' from source: facts 43681 1727204720.46886: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204720.46995: variable '__network_wireless_connections_defined' from source: role '' defaults 43681 1727204720.47187: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 43681 1727204720.49596: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 43681 1727204720.49600: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 43681 1727204720.49603: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 43681 1727204720.49626: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 43681 1727204720.49664: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 43681 1727204720.49769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204720.49819: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204720.49859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204720.49923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204720.49949: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204720.50036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204720.50071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204720.50111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204720.50168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204720.50193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204720.50256: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204720.50294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204720.50334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204720.50410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204720.50434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204720.50696: variable 'network_connections' from source: play vars 43681 1727204720.50718: variable 'profile' from source: play vars 43681 1727204720.50830: variable 'profile' from source: play vars 43681 1727204720.50842: variable 'interface' from source: set_fact 43681 1727204720.50938: variable 'interface' from source: set_fact 43681 1727204720.51047: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 43681 1727204720.51281: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 43681 1727204720.51335: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 43681 1727204720.51404: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 43681 1727204720.51455: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 43681 1727204720.51573: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 43681 1727204720.51577: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 43681 1727204720.51595: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204720.51635: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 43681 1727204720.51710: 
variable '__network_team_connections_defined' from source: role '' defaults 43681 1727204720.52194: variable 'network_connections' from source: play vars 43681 1727204720.52198: variable 'profile' from source: play vars 43681 1727204720.52201: variable 'profile' from source: play vars 43681 1727204720.52203: variable 'interface' from source: set_fact 43681 1727204720.52270: variable 'interface' from source: set_fact 43681 1727204720.52309: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 43681 1727204720.52330: when evaluation is False, skipping this task 43681 1727204720.52339: _execute() done 43681 1727204720.52348: dumping result to json 43681 1727204720.52357: done dumping result, returning 43681 1727204720.52434: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12b410aa-8751-9e86-7728-000000000078] 43681 1727204720.52445: sending task result for task 12b410aa-8751-9e86-7728-000000000078 43681 1727204720.52525: done sending task result for task 12b410aa-8751-9e86-7728-000000000078 43681 1727204720.52529: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 43681 1727204720.52850: no more pending results, returning what we have 43681 1727204720.52854: results queue empty 43681 1727204720.52855: checking for any_errors_fatal 43681 1727204720.52861: done checking for any_errors_fatal 43681 1727204720.52862: checking for max_fail_percentage 43681 1727204720.52864: done checking for max_fail_percentage 43681 1727204720.52865: checking to see if all hosts have failed and the running result is not ok 43681 1727204720.52867: done checking to see if all hosts have failed 43681 1727204720.52868: getting the remaining hosts for this loop 43681 1727204720.52869: done getting the remaining hosts for this loop 43681 1727204720.52874: getting the next task for host managed-node3 43681 1727204720.52880: done getting next task for host managed-node3 43681 1727204720.52885: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 43681 1727204720.52888: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204720.52931: getting variables 43681 1727204720.52934: in VariableManager get_vars() 43681 1727204720.52978: Calling all_inventory to load vars for managed-node3 43681 1727204720.52982: Calling groups_inventory to load vars for managed-node3 43681 1727204720.52984: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204720.53005: Calling all_plugins_play to load vars for managed-node3 43681 1727204720.53009: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204720.53014: Calling groups_plugins_play to load vars for managed-node3 43681 1727204720.55783: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204720.59150: done with get_vars() 43681 1727204720.59198: done getting variables 43681 1727204720.59283: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:05:20 -0400 (0:00:00.141) 0:00:28.259 ***** 43681 1727204720.59325: entering _queue_task() for managed-node3/package 43681 1727204720.59743: worker is 1 (out of 1 available) 43681 1727204720.59759: exiting _queue_task() for managed-node3/package 43681 1727204720.59772: done queuing things up, now waiting for results queue to drain 43681 1727204720.59774: waiting for pending results... 43681 1727204720.60111: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages 43681 1727204720.60260: in run() - task 12b410aa-8751-9e86-7728-000000000079 43681 1727204720.60331: variable 'ansible_search_path' from source: unknown 43681 1727204720.60335: variable 'ansible_search_path' from source: unknown 43681 1727204720.60395: calling self._execute() 43681 1727204720.60496: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204720.60511: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204720.60529: variable 'omit' from source: magic vars 43681 1727204720.61035: variable 'ansible_distribution_major_version' from source: facts 43681 1727204720.61054: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204720.61348: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 43681 1727204720.61755: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 43681 1727204720.61766: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 43681 1727204720.61815: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 43681 1727204720.61918: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 43681 1727204720.62072: variable 'network_packages' from source: role '' defaults 43681 1727204720.62238: variable '__network_provider_setup' from source: role '' defaults 43681 1727204720.62297: variable '__network_service_name_default_nm' from source: role '' defaults 43681 1727204720.62361: variable 
'__network_service_name_default_nm' from source: role '' defaults 43681 1727204720.62379: variable '__network_packages_default_nm' from source: role '' defaults 43681 1727204720.62473: variable '__network_packages_default_nm' from source: role '' defaults 43681 1727204720.62778: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 43681 1727204720.65350: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 43681 1727204720.65595: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 43681 1727204720.65599: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 43681 1727204720.65602: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 43681 1727204720.65759: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 43681 1727204720.65870: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204720.65916: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204720.65966: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204720.66027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204720.66062: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204720.66126: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204720.66172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204720.66214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204720.66277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204720.66304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204720.66652: variable '__network_packages_default_gobject_packages' from source: role '' defaults 43681 1727204720.66833: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204720.66938: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204720.66942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204720.66970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204720.66994: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204720.67126: variable 'ansible_python' from source: facts 43681 1727204720.67177: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 43681 1727204720.67302: variable '__network_wpa_supplicant_required' from source: role '' defaults 43681 1727204720.67423: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 43681 1727204720.67711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204720.67715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204720.67718: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204720.67756: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204720.67780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204720.67859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204720.67906: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204720.67957: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204720.68038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204720.68049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204720.68258: variable 'network_connections' from source: play vars 43681 1727204720.68363: variable 'profile' from source: play vars 43681 1727204720.68415: variable 'profile' from source: play vars 43681 1727204720.68430: variable 'interface' from source: set_fact 43681 1727204720.68528: variable 'interface' from source: set_fact 43681 1727204720.68631: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 43681 1727204720.68670: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 43681 1727204720.68724: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204720.68769: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 43681 1727204720.68851: variable '__network_wireless_connections_defined' from source: role '' defaults 43681 1727204720.69356: variable 'network_connections' from source: play vars 43681 1727204720.69361: variable 'profile' from source: play vars 43681 1727204720.69442: variable 'profile' from source: play vars 43681 1727204720.69461: variable 'interface' from source: set_fact 43681 1727204720.69561: variable 'interface' from source: set_fact 43681 1727204720.69621: variable '__network_packages_default_wireless' from source: role '' defaults 43681 1727204720.69792: variable '__network_wireless_connections_defined' from source: role '' defaults 43681 1727204720.70210: variable 'network_connections' from source: play vars 43681 1727204720.70229: variable 'profile' from source: play vars 43681 1727204720.70320: variable 'profile' from source: play vars 43681 1727204720.70337: variable 'interface' from source: set_fact 43681 1727204720.70477: variable 'interface' from source: set_fact 43681 1727204720.70522: variable '__network_packages_default_team' from source: role '' defaults 43681 1727204720.70639: variable '__network_team_connections_defined' from source: role '' defaults 43681 1727204720.71114: variable 'network_connections' from source: play vars 43681 1727204720.71195: variable 'profile' from source: play vars 43681 1727204720.71208: variable 'profile' from source: play vars 43681 1727204720.71229: variable 'interface' from source: set_fact 43681 1727204720.71365: variable 'interface' from source: set_fact 43681 1727204720.71451: variable '__network_service_name_default_initscripts' from source: role '' defaults 43681 1727204720.71535: variable '__network_service_name_default_initscripts' from source: role '' defaults 43681 1727204720.71560: variable '__network_packages_default_initscripts' from source: role '' defaults 43681 1727204720.71639: variable '__network_packages_default_initscripts' from source: role '' defaults 43681 1727204720.71975: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 43681 1727204720.72716: variable 'network_connections' from source: play vars 43681 1727204720.72729: variable 'profile' from source: play vars 43681 
1727204720.72817: variable 'profile' from source: play vars 43681 1727204720.72827: variable 'interface' from source: set_fact 43681 1727204720.72926: variable 'interface' from source: set_fact 43681 1727204720.72941: variable 'ansible_distribution' from source: facts 43681 1727204720.72956: variable '__network_rh_distros' from source: role '' defaults 43681 1727204720.72972: variable 'ansible_distribution_major_version' from source: facts 43681 1727204720.72993: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 43681 1727204720.73235: variable 'ansible_distribution' from source: facts 43681 1727204720.73245: variable '__network_rh_distros' from source: role '' defaults 43681 1727204720.73255: variable 'ansible_distribution_major_version' from source: facts 43681 1727204720.73266: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 43681 1727204720.73594: variable 'ansible_distribution' from source: facts 43681 1727204720.73597: variable '__network_rh_distros' from source: role '' defaults 43681 1727204720.73599: variable 'ansible_distribution_major_version' from source: facts 43681 1727204720.73603: variable 'network_provider' from source: set_fact 43681 1727204720.73605: variable 'ansible_facts' from source: unknown 43681 1727204720.74881: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 43681 1727204720.74895: when evaluation is False, skipping this task 43681 1727204720.74906: _execute() done 43681 1727204720.74919: dumping result to json 43681 1727204720.74937: done dumping result, returning 43681 1727204720.74995: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages [12b410aa-8751-9e86-7728-000000000079] 43681 1727204720.74998: sending task result for task 12b410aa-8751-9e86-7728-000000000079 skipping: [managed-node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 43681 1727204720.75349: no more pending results, returning what we have 43681 1727204720.75353: results queue empty 43681 1727204720.75355: checking for any_errors_fatal 43681 1727204720.75363: done checking for any_errors_fatal 43681 1727204720.75364: checking for max_fail_percentage 43681 1727204720.75366: done checking for max_fail_percentage 43681 1727204720.75367: checking to see if all hosts have failed and the running result is not ok 43681 1727204720.75369: done checking to see if all hosts have failed 43681 1727204720.75370: getting the remaining hosts for this loop 43681 1727204720.75371: done getting the remaining hosts for this loop 43681 1727204720.75376: getting the next task for host managed-node3 43681 1727204720.75382: done getting next task for host managed-node3 43681 1727204720.75387: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 43681 1727204720.75391: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204720.75409: getting variables 43681 1727204720.75411: in VariableManager get_vars() 43681 1727204720.75453: Calling all_inventory to load vars for managed-node3 43681 1727204720.75456: Calling groups_inventory to load vars for managed-node3 43681 1727204720.75459: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204720.75471: Calling all_plugins_play to load vars for managed-node3 43681 1727204720.75480: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204720.75484: Calling groups_plugins_play to load vars for managed-node3 43681 1727204720.75616: done sending task result for task 12b410aa-8751-9e86-7728-000000000079 43681 1727204720.75619: WORKER PROCESS EXITING 43681 1727204720.78195: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204720.81517: done with get_vars() 43681 1727204720.81564: done getting variables 43681 1727204720.81649: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 15:05:20 -0400 (0:00:00.223) 0:00:28.483 ***** 43681 1727204720.81686: entering _queue_task() for managed-node3/package 43681 1727204720.82295: worker is 1 (out of 1 available) 43681 1727204720.82308: exiting _queue_task() for managed-node3/package 43681 1727204720.82320: done queuing things up, now waiting for results queue to drain 43681 1727204720.82322: waiting for pending results... 
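The "Install packages" task (roles/network/tasks/main.yml:73) is skipped through an idempotence guard: the condition (not network_packages is subset(ansible_facts.packages.keys())) is False, meaning every package the role would install already appears in the package facts gathered earlier in the run. A sketch of that pattern; the condition and the package action come from the trace, the arguments are assumptions:

    - name: Install packages
      ansible.builtin.package:
        name: "{{ network_packages }}"   # assumed: the role's computed package list
        state: present
      when:
        - not network_packages is subset(ansible_facts.packages.keys())   # False here: nothing left to install

Guarding the install this way avoids invoking the package manager at all when the installed set already satisfies the role, which is why the long chain of __network_packages_default_* variable lookups above ends in a plain skip.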
43681 1727204720.82472: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 43681 1727204720.82624: in run() - task 12b410aa-8751-9e86-7728-00000000007a 43681 1727204720.82647: variable 'ansible_search_path' from source: unknown 43681 1727204720.82666: variable 'ansible_search_path' from source: unknown 43681 1727204720.82713: calling self._execute() 43681 1727204720.82843: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204720.82859: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204720.82888: variable 'omit' from source: magic vars 43681 1727204720.83369: variable 'ansible_distribution_major_version' from source: facts 43681 1727204720.83392: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204720.83571: variable 'network_state' from source: role '' defaults 43681 1727204720.83593: Evaluated conditional (network_state != {}): False 43681 1727204720.83603: when evaluation is False, skipping this task 43681 1727204720.83612: _execute() done 43681 1727204720.83620: dumping result to json 43681 1727204720.83629: done dumping result, returning 43681 1727204720.83652: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12b410aa-8751-9e86-7728-00000000007a] 43681 1727204720.83665: sending task result for task 12b410aa-8751-9e86-7728-00000000007a 43681 1727204720.83864: done sending task result for task 12b410aa-8751-9e86-7728-00000000007a 43681 1727204720.83867: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 43681 1727204720.83929: no more pending results, returning what we have 43681 1727204720.83936: results queue empty 43681 1727204720.83937: checking for any_errors_fatal 43681 1727204720.83944: done checking for any_errors_fatal 43681 1727204720.83945: checking for max_fail_percentage 43681 1727204720.83949: done checking for max_fail_percentage 43681 1727204720.83951: checking to see if all hosts have failed and the running result is not ok 43681 1727204720.83952: done checking to see if all hosts have failed 43681 1727204720.83953: getting the remaining hosts for this loop 43681 1727204720.83955: done getting the remaining hosts for this loop 43681 1727204720.83960: getting the next task for host managed-node3 43681 1727204720.83967: done getting next task for host managed-node3 43681 1727204720.83972: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 43681 1727204720.83975: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204720.83996: getting variables 43681 1727204720.83998: in VariableManager get_vars() 43681 1727204720.84043: Calling all_inventory to load vars for managed-node3 43681 1727204720.84046: Calling groups_inventory to load vars for managed-node3 43681 1727204720.84050: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204720.84064: Calling all_plugins_play to load vars for managed-node3 43681 1727204720.84068: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204720.84072: Calling groups_plugins_play to load vars for managed-node3 43681 1727204720.86721: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204720.90021: done with get_vars() 43681 1727204720.90066: done getting variables 43681 1727204720.90150: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 15:05:20 -0400 (0:00:00.084) 0:00:28.568 ***** 43681 1727204720.90187: entering _queue_task() for managed-node3/package 43681 1727204720.90822: worker is 1 (out of 1 available) 43681 1727204720.90835: exiting _queue_task() for managed-node3/package 43681 1727204720.90847: done queuing things up, now waiting for results queue to drain 43681 1727204720.90849: waiting for pending results... 
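The nmstate-related install (roles/network/tasks/main.yml:85) is gated on the network_state role variable, which keeps its default of an empty dict in this run, so (network_state != {}) is False and nothing is installed. A sketch with the condition taken from the trace and the package names as assumptions suggested by the task title:

    - name: Install NetworkManager and nmstate when using network_state variable
      ansible.builtin.package:
        name:
          - NetworkManager   # assumed from the task title
          - nmstate          # assumed from the task title
        state: present
      when: network_state != {}   # False here: network_state is still the empty default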
43681 1727204720.90958: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 43681 1727204720.91112: in run() - task 12b410aa-8751-9e86-7728-00000000007b 43681 1727204720.91138: variable 'ansible_search_path' from source: unknown 43681 1727204720.91149: variable 'ansible_search_path' from source: unknown 43681 1727204720.91207: calling self._execute() 43681 1727204720.91335: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204720.91411: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204720.91420: variable 'omit' from source: magic vars 43681 1727204720.91873: variable 'ansible_distribution_major_version' from source: facts 43681 1727204720.91896: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204720.92074: variable 'network_state' from source: role '' defaults 43681 1727204720.92094: Evaluated conditional (network_state != {}): False 43681 1727204720.92103: when evaluation is False, skipping this task 43681 1727204720.92111: _execute() done 43681 1727204720.92120: dumping result to json 43681 1727204720.92128: done dumping result, returning 43681 1727204720.92141: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12b410aa-8751-9e86-7728-00000000007b] 43681 1727204720.92175: sending task result for task 12b410aa-8751-9e86-7728-00000000007b skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 43681 1727204720.92453: no more pending results, returning what we have 43681 1727204720.92459: results queue empty 43681 1727204720.92461: checking for any_errors_fatal 43681 1727204720.92469: done checking for any_errors_fatal 43681 1727204720.92470: checking for max_fail_percentage 43681 1727204720.92473: done checking for max_fail_percentage 43681 1727204720.92475: checking to see if all hosts have failed and the running result is not ok 43681 1727204720.92476: done checking to see if all hosts have failed 43681 1727204720.92477: getting the remaining hosts for this loop 43681 1727204720.92479: done getting the remaining hosts for this loop 43681 1727204720.92484: getting the next task for host managed-node3 43681 1727204720.92497: done getting next task for host managed-node3 43681 1727204720.92507: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 43681 1727204720.92510: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204720.92528: getting variables 43681 1727204720.92530: in VariableManager get_vars() 43681 1727204720.92574: Calling all_inventory to load vars for managed-node3 43681 1727204720.92577: Calling groups_inventory to load vars for managed-node3 43681 1727204720.92580: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204720.92719: Calling all_plugins_play to load vars for managed-node3 43681 1727204720.92729: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204720.92735: done sending task result for task 12b410aa-8751-9e86-7728-00000000007b 43681 1727204720.92739: WORKER PROCESS EXITING 43681 1727204720.92744: Calling groups_plugins_play to load vars for managed-node3 43681 1727204720.96082: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204721.00866: done with get_vars() 43681 1727204721.00916: done getting variables 43681 1727204721.01000: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 15:05:21 -0400 (0:00:00.108) 0:00:28.677 ***** 43681 1727204721.01041: entering _queue_task() for managed-node3/service 43681 1727204721.01471: worker is 1 (out of 1 available) 43681 1727204721.01486: exiting _queue_task() for managed-node3/service 43681 1727204721.01637: done queuing things up, now waiting for results queue to drain 43681 1727204721.01640: waiting for pending results... 
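For reference, a minimal sketch of the task being queued here (roles/network/tasks/main.yml:109), again reconstructed only from this log: the 'service' action plugin and the "__network_wireless_connections_defined or __network_team_connections_defined" condition are visible; the service name and the restarted state are assumptions inferred from the task title, not shown in the output.

# sketch; service name and state are assumed
- name: Restart NetworkManager due to wireless or team interfaces
  service:
    name: NetworkManager    # assumed from the task title
    state: restarted        # assumed
  when: __network_wireless_connections_defined or __network_team_connections_defined

In this run neither wireless nor team connections are defined, so the condition comes back False and the task is skipped just like the previous one.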
43681 1727204721.01866: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 43681 1727204721.02096: in run() - task 12b410aa-8751-9e86-7728-00000000007c 43681 1727204721.02106: variable 'ansible_search_path' from source: unknown 43681 1727204721.02109: variable 'ansible_search_path' from source: unknown 43681 1727204721.02122: calling self._execute() 43681 1727204721.02266: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204721.02316: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204721.02324: variable 'omit' from source: magic vars 43681 1727204721.02843: variable 'ansible_distribution_major_version' from source: facts 43681 1727204721.02872: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204721.03044: variable '__network_wireless_connections_defined' from source: role '' defaults 43681 1727204721.03356: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 43681 1727204721.06350: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 43681 1727204721.06471: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 43681 1727204721.06595: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 43681 1727204721.06599: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 43681 1727204721.06612: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 43681 1727204721.06726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204721.06774: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204721.06814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204721.06884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204721.06910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204721.06986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204721.07024: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204721.07096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 43681 1727204721.07130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204721.07163: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204721.07230: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204721.07294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204721.07311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204721.07475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204721.07479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204721.07664: variable 'network_connections' from source: play vars 43681 1727204721.07684: variable 'profile' from source: play vars 43681 1727204721.07796: variable 'profile' from source: play vars 43681 1727204721.07807: variable 'interface' from source: set_fact 43681 1727204721.07903: variable 'interface' from source: set_fact 43681 1727204721.08035: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 43681 1727204721.08486: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 43681 1727204721.08542: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 43681 1727204721.08598: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 43681 1727204721.08687: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 43681 1727204721.08706: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 43681 1727204721.08740: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 43681 1727204721.08774: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204721.08825: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 43681 1727204721.08887: variable '__network_team_connections_defined' from source: role '' defaults 43681 
1727204721.09494: variable 'network_connections' from source: play vars 43681 1727204721.09498: variable 'profile' from source: play vars 43681 1727204721.09501: variable 'profile' from source: play vars 43681 1727204721.09503: variable 'interface' from source: set_fact 43681 1727204721.09505: variable 'interface' from source: set_fact 43681 1727204721.09507: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 43681 1727204721.09510: when evaluation is False, skipping this task 43681 1727204721.09512: _execute() done 43681 1727204721.09514: dumping result to json 43681 1727204721.09516: done dumping result, returning 43681 1727204721.09518: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12b410aa-8751-9e86-7728-00000000007c] 43681 1727204721.09530: sending task result for task 12b410aa-8751-9e86-7728-00000000007c 43681 1727204721.09764: done sending task result for task 12b410aa-8751-9e86-7728-00000000007c 43681 1727204721.09767: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 43681 1727204721.09823: no more pending results, returning what we have 43681 1727204721.09827: results queue empty 43681 1727204721.09828: checking for any_errors_fatal 43681 1727204721.09836: done checking for any_errors_fatal 43681 1727204721.09837: checking for max_fail_percentage 43681 1727204721.09840: done checking for max_fail_percentage 43681 1727204721.09841: checking to see if all hosts have failed and the running result is not ok 43681 1727204721.09842: done checking to see if all hosts have failed 43681 1727204721.09843: getting the remaining hosts for this loop 43681 1727204721.09845: done getting the remaining hosts for this loop 43681 1727204721.09850: getting the next task for host managed-node3 43681 1727204721.09865: done getting next task for host managed-node3 43681 1727204721.09871: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 43681 1727204721.09874: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204721.09893: getting variables 43681 1727204721.09896: in VariableManager get_vars() 43681 1727204721.09943: Calling all_inventory to load vars for managed-node3 43681 1727204721.09946: Calling groups_inventory to load vars for managed-node3 43681 1727204721.09950: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204721.09962: Calling all_plugins_play to load vars for managed-node3 43681 1727204721.09966: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204721.10094: Calling groups_plugins_play to load vars for managed-node3 43681 1727204721.12820: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204721.16138: done with get_vars() 43681 1727204721.16192: done getting variables 43681 1727204721.16269: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 15:05:21 -0400 (0:00:00.152) 0:00:28.829 ***** 43681 1727204721.16310: entering _queue_task() for managed-node3/service 43681 1727204721.16708: worker is 1 (out of 1 available) 43681 1727204721.16723: exiting _queue_task() for managed-node3/service 43681 1727204721.16737: done queuing things up, now waiting for results queue to drain 43681 1727204721.16738: waiting for pending results... 43681 1727204721.17117: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 43681 1727204721.17213: in run() - task 12b410aa-8751-9e86-7728-00000000007d 43681 1727204721.17239: variable 'ansible_search_path' from source: unknown 43681 1727204721.17248: variable 'ansible_search_path' from source: unknown 43681 1727204721.17294: calling self._execute() 43681 1727204721.17434: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204721.17542: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204721.17546: variable 'omit' from source: magic vars 43681 1727204721.17935: variable 'ansible_distribution_major_version' from source: facts 43681 1727204721.17955: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204721.18196: variable 'network_provider' from source: set_fact 43681 1727204721.18212: variable 'network_state' from source: role '' defaults 43681 1727204721.18229: Evaluated conditional (network_provider == "nm" or network_state != {}): True 43681 1727204721.18240: variable 'omit' from source: magic vars 43681 1727204721.18286: variable 'omit' from source: magic vars 43681 1727204721.18337: variable 'network_service_name' from source: role '' defaults 43681 1727204721.18437: variable 'network_service_name' from source: role '' defaults 43681 1727204721.18580: variable '__network_provider_setup' from source: role '' defaults 43681 1727204721.18596: variable '__network_service_name_default_nm' from source: role '' defaults 43681 1727204721.18682: variable '__network_service_name_default_nm' from source: role '' defaults 43681 1727204721.18700: variable '__network_packages_default_nm' from source: role '' defaults 
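For reference, a minimal sketch of the task being queued here (roles/network/tasks/main.yml:122). This one is comparatively well grounded: the 'service' action plugin, the network_service_name role variable, the evaluated condition, and the module arguments recorded in the result further down (name=NetworkManager, state=started, enabled=true) all appear in this log. The exact YAML layout is still an assumption, not the role's verbatim source.

# sketch; argument names mirror the module invocation recorded in the result below
- name: Enable and start NetworkManager
  service:
    name: "{{ network_service_name }}"   # resolves to NetworkManager for the nm provider
    state: started
    enabled: true
  when: network_provider == "nm" or network_state != {}

Note that although the 'service' action plugin is loaded on the controller, the payload executed on managed-node3 is AnsiballZ_systemd.py, i.e. the systemd module that 'service' delegates to on systemd-based hosts; that is why the task result below is a full dump of the NetworkManager.service unit properties.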
43681 1727204721.18785: variable '__network_packages_default_nm' from source: role '' defaults 43681 1727204721.19120: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 43681 1727204721.21742: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 43681 1727204721.21877: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 43681 1727204721.21906: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 43681 1727204721.21954: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 43681 1727204721.21998: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 43681 1727204721.22195: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204721.22199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204721.22203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204721.22243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204721.22267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204721.22339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204721.22376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204721.22414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204721.22476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204721.22502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204721.22866: variable '__network_packages_default_gobject_packages' from source: role '' defaults 43681 1727204721.23027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204721.23085: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204721.23107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204721.23165: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204721.23195: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204721.23394: variable 'ansible_python' from source: facts 43681 1727204721.23398: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 43681 1727204721.23469: variable '__network_wpa_supplicant_required' from source: role '' defaults 43681 1727204721.23583: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 43681 1727204721.23771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204721.23810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204721.23855: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204721.23913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204721.23937: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204721.24014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204721.24061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204721.24178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204721.24182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204721.24185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204721.24371: variable 'network_connections' from 
source: play vars 43681 1727204721.24385: variable 'profile' from source: play vars 43681 1727204721.24492: variable 'profile' from source: play vars 43681 1727204721.24506: variable 'interface' from source: set_fact 43681 1727204721.24595: variable 'interface' from source: set_fact 43681 1727204721.24794: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 43681 1727204721.25015: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 43681 1727204721.25087: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 43681 1727204721.25151: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 43681 1727204721.25240: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 43681 1727204721.25302: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 43681 1727204721.25348: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 43681 1727204721.25400: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204721.25457: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 43681 1727204721.25513: variable '__network_wireless_connections_defined' from source: role '' defaults 43681 1727204721.25994: variable 'network_connections' from source: play vars 43681 1727204721.25998: variable 'profile' from source: play vars 43681 1727204721.26048: variable 'profile' from source: play vars 43681 1727204721.26061: variable 'interface' from source: set_fact 43681 1727204721.26143: variable 'interface' from source: set_fact 43681 1727204721.26197: variable '__network_packages_default_wireless' from source: role '' defaults 43681 1727204721.26313: variable '__network_wireless_connections_defined' from source: role '' defaults 43681 1727204721.26797: variable 'network_connections' from source: play vars 43681 1727204721.26805: variable 'profile' from source: play vars 43681 1727204721.26849: variable 'profile' from source: play vars 43681 1727204721.26861: variable 'interface' from source: set_fact 43681 1727204721.26959: variable 'interface' from source: set_fact 43681 1727204721.27001: variable '__network_packages_default_team' from source: role '' defaults 43681 1727204721.27112: variable '__network_team_connections_defined' from source: role '' defaults 43681 1727204721.27531: variable 'network_connections' from source: play vars 43681 1727204721.27543: variable 'profile' from source: play vars 43681 1727204721.27642: variable 'profile' from source: play vars 43681 1727204721.27654: variable 'interface' from source: set_fact 43681 1727204721.27787: variable 'interface' from source: set_fact 43681 1727204721.27837: variable '__network_service_name_default_initscripts' from source: role '' defaults 43681 1727204721.27924: variable '__network_service_name_default_initscripts' from source: role '' defaults 43681 1727204721.27938: 
variable '__network_packages_default_initscripts' from source: role '' defaults 43681 1727204721.28025: variable '__network_packages_default_initscripts' from source: role '' defaults 43681 1727204721.28395: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 43681 1727204721.29067: variable 'network_connections' from source: play vars 43681 1727204721.29079: variable 'profile' from source: play vars 43681 1727204721.29167: variable 'profile' from source: play vars 43681 1727204721.29178: variable 'interface' from source: set_fact 43681 1727204721.29272: variable 'interface' from source: set_fact 43681 1727204721.29287: variable 'ansible_distribution' from source: facts 43681 1727204721.29301: variable '__network_rh_distros' from source: role '' defaults 43681 1727204721.29320: variable 'ansible_distribution_major_version' from source: facts 43681 1727204721.29342: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 43681 1727204721.29696: variable 'ansible_distribution' from source: facts 43681 1727204721.29699: variable '__network_rh_distros' from source: role '' defaults 43681 1727204721.29702: variable 'ansible_distribution_major_version' from source: facts 43681 1727204721.29704: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 43681 1727204721.29851: variable 'ansible_distribution' from source: facts 43681 1727204721.29862: variable '__network_rh_distros' from source: role '' defaults 43681 1727204721.29873: variable 'ansible_distribution_major_version' from source: facts 43681 1727204721.29929: variable 'network_provider' from source: set_fact 43681 1727204721.29967: variable 'omit' from source: magic vars 43681 1727204721.30009: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204721.30057: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204721.30084: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204721.30148: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204721.30154: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204721.30178: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204721.30187: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204721.30199: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204721.30336: Set connection var ansible_shell_type to sh 43681 1727204721.30365: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204721.30369: Set connection var ansible_timeout to 10 43681 1727204721.30387: Set connection var ansible_pipelining to False 43681 1727204721.30475: Set connection var ansible_connection to ssh 43681 1727204721.30478: Set connection var ansible_shell_executable to /bin/sh 43681 1727204721.30483: variable 'ansible_shell_executable' from source: unknown 43681 1727204721.30485: variable 'ansible_connection' from source: unknown 43681 1727204721.30488: variable 'ansible_module_compression' from source: unknown 43681 1727204721.30491: variable 'ansible_shell_type' from source: unknown 43681 1727204721.30494: variable 'ansible_shell_executable' from 
source: unknown 43681 1727204721.30496: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204721.30503: variable 'ansible_pipelining' from source: unknown 43681 1727204721.30506: variable 'ansible_timeout' from source: unknown 43681 1727204721.30508: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204721.30694: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 43681 1727204721.30699: variable 'omit' from source: magic vars 43681 1727204721.30702: starting attempt loop 43681 1727204721.30705: running the handler 43681 1727204721.30798: variable 'ansible_facts' from source: unknown 43681 1727204721.32143: _low_level_execute_command(): starting 43681 1727204721.32287: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 43681 1727204721.32921: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204721.32964: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204721.32982: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 43681 1727204721.33006: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204721.33076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204721.33117: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204721.33139: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204721.33184: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204721.33259: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204721.35039: stdout chunk (state=3): >>>/root <<< 43681 1727204721.35215: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204721.35253: stderr chunk (state=3): >>><<< 43681 1727204721.35271: stdout chunk (state=3): >>><<< 43681 1727204721.35327: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204721.35339: _low_level_execute_command(): starting 43681 1727204721.35440: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204721.3530993-44886-83279284361036 `" && echo ansible-tmp-1727204721.3530993-44886-83279284361036="` echo /root/.ansible/tmp/ansible-tmp-1727204721.3530993-44886-83279284361036 `" ) && sleep 0' 43681 1727204721.36112: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204721.36164: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204721.36181: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204721.36220: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204721.36291: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204721.43281: stdout chunk (state=3): >>>ansible-tmp-1727204721.3530993-44886-83279284361036=/root/.ansible/tmp/ansible-tmp-1727204721.3530993-44886-83279284361036 <<< 43681 1727204721.43697: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204721.43701: stdout chunk (state=3): >>><<< 43681 1727204721.43704: stderr chunk (state=3): >>><<< 43681 1727204721.43706: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204721.3530993-44886-83279284361036=/root/.ansible/tmp/ansible-tmp-1727204721.3530993-44886-83279284361036 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204721.43708: variable 'ansible_module_compression' from source: unknown 43681 1727204721.43710: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-436817kzep24q/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 43681 1727204721.43712: variable 'ansible_facts' from source: unknown 43681 1727204721.43960: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204721.3530993-44886-83279284361036/AnsiballZ_systemd.py 43681 1727204721.44183: Sending initial data 43681 1727204721.44197: Sent initial data (155 bytes) 43681 1727204721.45047: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204721.45082: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204721.45109: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204721.45186: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204721.46835: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 43681 1727204721.46885: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 43681 1727204721.46932: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-436817kzep24q/tmpesy9ccus /root/.ansible/tmp/ansible-tmp-1727204721.3530993-44886-83279284361036/AnsiballZ_systemd.py <<< 43681 1727204721.46957: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204721.3530993-44886-83279284361036/AnsiballZ_systemd.py" <<< 43681 1727204721.46989: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 43681 1727204721.47196: stderr chunk (state=3): >>>debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-436817kzep24q/tmpesy9ccus" to remote "/root/.ansible/tmp/ansible-tmp-1727204721.3530993-44886-83279284361036/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204721.3530993-44886-83279284361036/AnsiballZ_systemd.py" <<< 43681 1727204721.49629: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204721.49673: stderr chunk (state=3): >>><<< 43681 1727204721.49684: stdout chunk (state=3): >>><<< 43681 1727204721.49717: done transferring module to remote 43681 1727204721.49747: _low_level_execute_command(): starting 43681 1727204721.49758: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204721.3530993-44886-83279284361036/ /root/.ansible/tmp/ansible-tmp-1727204721.3530993-44886-83279284361036/AnsiballZ_systemd.py && sleep 0' 43681 1727204721.50595: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204721.50619: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204721.50644: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204721.50719: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204721.52672: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204721.52675: stdout chunk (state=3): >>><<< 43681 1727204721.52678: stderr chunk (state=3): >>><<< 43681 1727204721.52794: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204721.52797: _low_level_execute_command(): starting 43681 1727204721.52802: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204721.3530993-44886-83279284361036/AnsiballZ_systemd.py && sleep 0' 43681 1727204721.53404: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204721.53443: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204721.53477: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204721.53481: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204721.53545: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204721.86548: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "647", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:45:24 EDT", "ExecMainStartTimestampMonotonic": "28911103", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "647", "ExecMainCode": "0", "ExecMainStatus": "0", 
"ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3378", "MemoryCurrent": "11902976", "MemoryAvailable": "infinity", "CPUUsageNSec": "2034483000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "in<<< 43681 1727204721.86597: stdout chunk (state=3): >>>finity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", 
"OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target cloud-init.service shutdown.target NetworkManager-wait-online.service network.service network.target", "After": "network-pre.target basic.target dbus-broker.service cloud-init-local.service systemd-journald.socket sysinit.target dbus.socket system.slice", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "l<<< 43681 1727204721.86601: stdout chunk (state=3): >>>oaded", 
"ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:02:42 EDT", "StateChangeTimestampMonotonic": "1066831351", "InactiveExitTimestamp": "Tue 2024-09-24 14:45:24 EDT", "InactiveExitTimestampMonotonic": "28911342", "ActiveEnterTimestamp": "Tue 2024-09-24 14:45:25 EDT", "ActiveEnterTimestampMonotonic": "29816317", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:45:24 EDT", "ConditionTimestampMonotonic": "28901880", "AssertTimestamp": "Tue 2024-09-24 14:45:24 EDT", "AssertTimestampMonotonic": "28901883", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "b6a383b318af414f897f5e2227729b18", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 43681 1727204721.88895: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
<<< 43681 1727204721.88899: stderr chunk (state=3): >>><<< 43681 1727204721.88902: stdout chunk (state=3): >>><<< 43681 1727204721.88906: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "647", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:45:24 EDT", "ExecMainStartTimestampMonotonic": "28911103", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "647", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3378", "MemoryCurrent": "11902976", "MemoryAvailable": "infinity", "CPUUsageNSec": "2034483000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", 
"StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target cloud-init.service shutdown.target NetworkManager-wait-online.service network.service network.target", "After": "network-pre.target basic.target dbus-broker.service cloud-init-local.service systemd-journald.socket sysinit.target dbus.socket system.slice", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:02:42 EDT", "StateChangeTimestampMonotonic": "1066831351", "InactiveExitTimestamp": "Tue 2024-09-24 14:45:24 EDT", "InactiveExitTimestampMonotonic": "28911342", "ActiveEnterTimestamp": "Tue 2024-09-24 14:45:25 EDT", "ActiveEnterTimestampMonotonic": "29816317", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:45:24 EDT", "ConditionTimestampMonotonic": "28901880", "AssertTimestamp": "Tue 2024-09-24 14:45:24 EDT", "AssertTimestampMonotonic": "28901883", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "b6a383b318af414f897f5e2227729b18", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 43681 1727204721.89034: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204721.3530993-44886-83279284361036/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 43681 1727204721.89184: _low_level_execute_command(): starting 43681 1727204721.89353: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204721.3530993-44886-83279284361036/ > /dev/null 2>&1 && sleep 0' 43681 1727204721.90023: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204721.90048: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204721.90067: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204721.90092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204721.90112: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 43681 1727204721.90125: stderr chunk (state=3): >>>debug2: match not found <<< 43681 1727204721.90154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 43681 1727204721.90251: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204721.90280: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204721.90346: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 43681 1727204721.92359: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204721.92365: stdout chunk (state=3): >>><<< 43681 1727204721.92367: stderr chunk (state=3): >>><<< 43681 1727204721.92595: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204721.92599: handler run complete 43681 1727204721.92602: attempt loop complete, returning result 43681 1727204721.92604: _execute() done 43681 1727204721.92606: dumping result to json 43681 1727204721.92608: done dumping result, returning 43681 1727204721.92610: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12b410aa-8751-9e86-7728-00000000007d] 43681 1727204721.92612: sending task result for task 12b410aa-8751-9e86-7728-00000000007d ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 43681 1727204721.93368: no more pending results, returning what we have 43681 1727204721.93372: results queue empty 43681 1727204721.93373: checking for any_errors_fatal 43681 1727204721.93381: done checking for any_errors_fatal 43681 1727204721.93382: checking for max_fail_percentage 43681 1727204721.93384: done checking for max_fail_percentage 43681 1727204721.93385: checking to see if all hosts have failed and the running result is not ok 43681 1727204721.93386: done checking to see if all hosts have failed 43681 1727204721.93388: getting the remaining hosts for this loop 43681 1727204721.93390: done getting the remaining hosts for this loop 43681 1727204721.93511: getting the next task for host managed-node3 43681 1727204721.93522: done getting next task for host managed-node3 43681 1727204721.93526: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 43681 1727204721.93529: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204721.93541: getting variables 43681 1727204721.93543: in VariableManager get_vars() 43681 1727204721.93579: Calling all_inventory to load vars for managed-node3 43681 1727204721.93582: Calling groups_inventory to load vars for managed-node3 43681 1727204721.93585: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204721.93637: Calling all_plugins_play to load vars for managed-node3 43681 1727204721.93642: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204721.93709: Calling groups_plugins_play to load vars for managed-node3 43681 1727204721.93727: done sending task result for task 12b410aa-8751-9e86-7728-00000000007d 43681 1727204721.93730: WORKER PROCESS EXITING 43681 1727204721.96017: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204721.98442: done with get_vars() 43681 1727204721.98478: done getting variables 43681 1727204721.98553: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 15:05:21 -0400 (0:00:00.822) 0:00:29.652 ***** 43681 1727204721.98591: entering _queue_task() for managed-node3/service 43681 1727204721.98973: worker is 1 (out of 1 available) 43681 1727204721.98988: exiting _queue_task() for managed-node3/service 43681 1727204721.99002: done queuing things up, now waiting for results queue to drain 43681 1727204721.99004: waiting for pending results... 
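(For reference: the ansible.legacy.systemd invocation logged above is the kind of result produced by a service-management task of roughly the following shape. This is a hedged sketch reconstructed from the logged module_args (name=NetworkManager, state=started, enabled=true) and from the 'service' action plugin shown being loaded; the literal task lives in the role's tasks/main.yml and may differ in detail.)

    - name: Enable and start NetworkManager
      ansible.builtin.service:      # the log shows the 'service' action plugin delegating to ansible.legacy.systemd
        name: NetworkManager        # from the logged module_args
        state: started              # logged: state=started
        enabled: true               # logged: enabled=true
      no_log: true                  # matches the censored result ("no_log: true" was specified for this result)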
43681 1727204721.99227: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 43681 1727204721.99316: in run() - task 12b410aa-8751-9e86-7728-00000000007e 43681 1727204721.99332: variable 'ansible_search_path' from source: unknown 43681 1727204721.99336: variable 'ansible_search_path' from source: unknown 43681 1727204721.99372: calling self._execute() 43681 1727204721.99466: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204721.99473: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204721.99483: variable 'omit' from source: magic vars 43681 1727204721.99828: variable 'ansible_distribution_major_version' from source: facts 43681 1727204721.99840: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204721.99947: variable 'network_provider' from source: set_fact 43681 1727204721.99953: Evaluated conditional (network_provider == "nm"): True 43681 1727204722.00038: variable '__network_wpa_supplicant_required' from source: role '' defaults 43681 1727204722.00113: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 43681 1727204722.00271: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 43681 1727204722.02012: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 43681 1727204722.02070: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 43681 1727204722.02110: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 43681 1727204722.02139: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 43681 1727204722.02163: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 43681 1727204722.02250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204722.02276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204722.02302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204722.02340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204722.02353: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204722.02393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204722.02416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 43681 1727204722.02442: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204722.02473: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204722.02486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204722.02527: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204722.02550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204722.02571: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204722.02605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204722.02621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204722.02747: variable 'network_connections' from source: play vars 43681 1727204722.02762: variable 'profile' from source: play vars 43681 1727204722.02829: variable 'profile' from source: play vars 43681 1727204722.02833: variable 'interface' from source: set_fact 43681 1727204722.02887: variable 'interface' from source: set_fact 43681 1727204722.02951: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 43681 1727204722.03089: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 43681 1727204722.03122: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 43681 1727204722.03154: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 43681 1727204722.03182: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 43681 1727204722.03230: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 43681 1727204722.03250: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 43681 1727204722.03271: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204722.03296: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 43681 1727204722.03341: variable '__network_wireless_connections_defined' from source: role '' defaults 43681 1727204722.03725: variable 'network_connections' from source: play vars 43681 1727204722.03729: variable 'profile' from source: play vars 43681 1727204722.03732: variable 'profile' from source: play vars 43681 1727204722.03734: variable 'interface' from source: set_fact 43681 1727204722.03832: variable 'interface' from source: set_fact 43681 1727204722.03835: Evaluated conditional (__network_wpa_supplicant_required): False 43681 1727204722.03838: when evaluation is False, skipping this task 43681 1727204722.03840: _execute() done 43681 1727204722.03850: dumping result to json 43681 1727204722.03853: done dumping result, returning 43681 1727204722.03855: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12b410aa-8751-9e86-7728-00000000007e] 43681 1727204722.03858: sending task result for task 12b410aa-8751-9e86-7728-00000000007e 43681 1727204722.03936: done sending task result for task 12b410aa-8751-9e86-7728-00000000007e 43681 1727204722.03939: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 43681 1727204722.03994: no more pending results, returning what we have 43681 1727204722.03998: results queue empty 43681 1727204722.03999: checking for any_errors_fatal 43681 1727204722.04137: done checking for any_errors_fatal 43681 1727204722.04138: checking for max_fail_percentage 43681 1727204722.04140: done checking for max_fail_percentage 43681 1727204722.04141: checking to see if all hosts have failed and the running result is not ok 43681 1727204722.04142: done checking to see if all hosts have failed 43681 1727204722.04143: getting the remaining hosts for this loop 43681 1727204722.04144: done getting the remaining hosts for this loop 43681 1727204722.04148: getting the next task for host managed-node3 43681 1727204722.04154: done getting next task for host managed-node3 43681 1727204722.04158: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 43681 1727204722.04160: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204722.04174: getting variables 43681 1727204722.04175: in VariableManager get_vars() 43681 1727204722.04216: Calling all_inventory to load vars for managed-node3 43681 1727204722.04230: Calling groups_inventory to load vars for managed-node3 43681 1727204722.04233: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204722.04244: Calling all_plugins_play to load vars for managed-node3 43681 1727204722.04248: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204722.04252: Calling groups_plugins_play to load vars for managed-node3 43681 1727204722.06765: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204722.09298: done with get_vars() 43681 1727204722.09334: done getting variables 43681 1727204722.09386: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 15:05:22 -0400 (0:00:00.108) 0:00:29.760 ***** 43681 1727204722.09416: entering _queue_task() for managed-node3/service 43681 1727204722.09699: worker is 1 (out of 1 available) 43681 1727204722.09716: exiting _queue_task() for managed-node3/service 43681 1727204722.09729: done queuing things up, now waiting for results queue to drain 43681 1727204722.09731: waiting for pending results... 43681 1727204722.09931: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service 43681 1727204722.10019: in run() - task 12b410aa-8751-9e86-7728-00000000007f 43681 1727204722.10034: variable 'ansible_search_path' from source: unknown 43681 1727204722.10038: variable 'ansible_search_path' from source: unknown 43681 1727204722.10071: calling self._execute() 43681 1727204722.10166: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204722.10173: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204722.10186: variable 'omit' from source: magic vars 43681 1727204722.10523: variable 'ansible_distribution_major_version' from source: facts 43681 1727204722.10536: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204722.10639: variable 'network_provider' from source: set_fact 43681 1727204722.10645: Evaluated conditional (network_provider == "initscripts"): False 43681 1727204722.10648: when evaluation is False, skipping this task 43681 1727204722.10652: _execute() done 43681 1727204722.10657: dumping result to json 43681 1727204722.10660: done dumping result, returning 43681 1727204722.10669: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service [12b410aa-8751-9e86-7728-00000000007f] 43681 1727204722.10675: sending task result for task 12b410aa-8751-9e86-7728-00000000007f 43681 1727204722.10767: done sending task result for task 12b410aa-8751-9e86-7728-00000000007f 43681 1727204722.10770: WORKER PROCESS EXITING skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 43681 
1727204722.10826: no more pending results, returning what we have 43681 1727204722.10831: results queue empty 43681 1727204722.10832: checking for any_errors_fatal 43681 1727204722.10843: done checking for any_errors_fatal 43681 1727204722.10844: checking for max_fail_percentage 43681 1727204722.10846: done checking for max_fail_percentage 43681 1727204722.10847: checking to see if all hosts have failed and the running result is not ok 43681 1727204722.10848: done checking to see if all hosts have failed 43681 1727204722.10849: getting the remaining hosts for this loop 43681 1727204722.10850: done getting the remaining hosts for this loop 43681 1727204722.10855: getting the next task for host managed-node3 43681 1727204722.10861: done getting next task for host managed-node3 43681 1727204722.10866: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 43681 1727204722.10869: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 43681 1727204722.10887: getting variables 43681 1727204722.10889: in VariableManager get_vars() 43681 1727204722.10930: Calling all_inventory to load vars for managed-node3 43681 1727204722.10933: Calling groups_inventory to load vars for managed-node3 43681 1727204722.10935: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204722.10946: Calling all_plugins_play to load vars for managed-node3 43681 1727204722.10949: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204722.10952: Calling groups_plugins_play to load vars for managed-node3 43681 1727204722.12371: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204722.14007: done with get_vars() 43681 1727204722.14031: done getting variables 43681 1727204722.14084: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 15:05:22 -0400 (0:00:00.046) 0:00:29.807 ***** 43681 1727204722.14112: entering _queue_task() for managed-node3/copy 43681 1727204722.14372: worker is 1 (out of 1 available) 43681 1727204722.14390: exiting _queue_task() for managed-node3/copy 43681 1727204722.14403: done queuing things up, now waiting for results queue to drain 43681 1727204722.14405: waiting for pending results... 
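(For reference: the skips above follow the standard conditional pattern, where a task's when: expression is evaluated first and, if False, the task is reported as skipped along with the failing condition. A hedged sketch, assuming service-style task bodies, since the skipped tasks' own arguments never appear in the log:)

    - name: Enable and start wpa_supplicant
      ansible.builtin.service:
        name: wpa_supplicant        # assumed; the skipped task's arguments are not logged
        state: started
        enabled: true
      when: __network_wpa_supplicant_required    # logged: Evaluated conditional (__network_wpa_supplicant_required): False

    - name: Enable network service
      ansible.builtin.service:
        name: network               # assumed; only the false condition is logged
        state: started
        enabled: true
      when: network_provider == "initscripts"    # logged false_condition; provider is "nm" on this run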
43681 1727204722.14600: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 43681 1727204722.14683: in run() - task 12b410aa-8751-9e86-7728-000000000080 43681 1727204722.14696: variable 'ansible_search_path' from source: unknown 43681 1727204722.14699: variable 'ansible_search_path' from source: unknown 43681 1727204722.14733: calling self._execute() 43681 1727204722.14828: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204722.14834: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204722.14846: variable 'omit' from source: magic vars 43681 1727204722.15170: variable 'ansible_distribution_major_version' from source: facts 43681 1727204722.15182: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204722.15284: variable 'network_provider' from source: set_fact 43681 1727204722.15288: Evaluated conditional (network_provider == "initscripts"): False 43681 1727204722.15291: when evaluation is False, skipping this task 43681 1727204722.15306: _execute() done 43681 1727204722.15310: dumping result to json 43681 1727204722.15312: done dumping result, returning 43681 1727204722.15316: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12b410aa-8751-9e86-7728-000000000080] 43681 1727204722.15320: sending task result for task 12b410aa-8751-9e86-7728-000000000080 skipping: [managed-node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 43681 1727204722.15481: no more pending results, returning what we have 43681 1727204722.15486: results queue empty 43681 1727204722.15487: checking for any_errors_fatal 43681 1727204722.15496: done checking for any_errors_fatal 43681 1727204722.15497: checking for max_fail_percentage 43681 1727204722.15498: done checking for max_fail_percentage 43681 1727204722.15500: checking to see if all hosts have failed and the running result is not ok 43681 1727204722.15501: done checking to see if all hosts have failed 43681 1727204722.15502: getting the remaining hosts for this loop 43681 1727204722.15503: done getting the remaining hosts for this loop 43681 1727204722.15507: getting the next task for host managed-node3 43681 1727204722.15515: done getting next task for host managed-node3 43681 1727204722.15520: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 43681 1727204722.15526: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204722.15542: getting variables 43681 1727204722.15543: in VariableManager get_vars() 43681 1727204722.15579: Calling all_inventory to load vars for managed-node3 43681 1727204722.15582: Calling groups_inventory to load vars for managed-node3 43681 1727204722.15584: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204722.15604: Calling all_plugins_play to load vars for managed-node3 43681 1727204722.15608: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204722.15614: done sending task result for task 12b410aa-8751-9e86-7728-000000000080 43681 1727204722.15616: WORKER PROCESS EXITING 43681 1727204722.15623: Calling groups_plugins_play to load vars for managed-node3 43681 1727204722.16877: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204722.18606: done with get_vars() 43681 1727204722.18630: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 15:05:22 -0400 (0:00:00.045) 0:00:29.853 ***** 43681 1727204722.18704: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 43681 1727204722.18972: worker is 1 (out of 1 available) 43681 1727204722.18988: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 43681 1727204722.19003: done queuing things up, now waiting for results queue to drain 43681 1727204722.19005: waiting for pending results... 43681 1727204722.19208: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 43681 1727204722.19299: in run() - task 12b410aa-8751-9e86-7728-000000000081 43681 1727204722.19313: variable 'ansible_search_path' from source: unknown 43681 1727204722.19317: variable 'ansible_search_path' from source: unknown 43681 1727204722.19354: calling self._execute() 43681 1727204722.19446: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204722.19451: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204722.19466: variable 'omit' from source: magic vars 43681 1727204722.19799: variable 'ansible_distribution_major_version' from source: facts 43681 1727204722.19811: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204722.19818: variable 'omit' from source: magic vars 43681 1727204722.19855: variable 'omit' from source: magic vars 43681 1727204722.20004: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 43681 1727204722.21774: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 43681 1727204722.21833: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 43681 1727204722.21869: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 43681 1727204722.21904: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 43681 1727204722.21931: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 43681 1727204722.22007: variable 'network_provider' from source: set_fact 43681 1727204722.22132: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204722.22168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204722.22191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204722.22232: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204722.22245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204722.22312: variable 'omit' from source: magic vars 43681 1727204722.22417: variable 'omit' from source: magic vars 43681 1727204722.22506: variable 'network_connections' from source: play vars 43681 1727204722.22517: variable 'profile' from source: play vars 43681 1727204722.22581: variable 'profile' from source: play vars 43681 1727204722.22585: variable 'interface' from source: set_fact 43681 1727204722.22643: variable 'interface' from source: set_fact 43681 1727204722.22770: variable 'omit' from source: magic vars 43681 1727204722.22778: variable '__lsr_ansible_managed' from source: task vars 43681 1727204722.22833: variable '__lsr_ansible_managed' from source: task vars 43681 1727204722.23080: Loaded config def from plugin (lookup/template) 43681 1727204722.23084: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 43681 1727204722.23114: File lookup term: get_ansible_managed.j2 43681 1727204722.23118: variable 'ansible_search_path' from source: unknown 43681 1727204722.23123: evaluation_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 43681 1727204722.23140: search_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 43681 1727204722.23156: variable 'ansible_search_path' from source: unknown 43681 1727204722.35564: variable 'ansible_managed' from source: unknown 43681 
1727204722.35996: variable 'omit' from source: magic vars 43681 1727204722.36000: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204722.36003: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204722.36005: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204722.36008: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204722.36010: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204722.36012: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204722.36014: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204722.36017: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204722.36115: Set connection var ansible_shell_type to sh 43681 1727204722.36131: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204722.36146: Set connection var ansible_timeout to 10 43681 1727204722.36163: Set connection var ansible_pipelining to False 43681 1727204722.36176: Set connection var ansible_connection to ssh 43681 1727204722.36187: Set connection var ansible_shell_executable to /bin/sh 43681 1727204722.36227: variable 'ansible_shell_executable' from source: unknown 43681 1727204722.36241: variable 'ansible_connection' from source: unknown 43681 1727204722.36256: variable 'ansible_module_compression' from source: unknown 43681 1727204722.36277: variable 'ansible_shell_type' from source: unknown 43681 1727204722.36294: variable 'ansible_shell_executable' from source: unknown 43681 1727204722.36306: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204722.36309: variable 'ansible_pipelining' from source: unknown 43681 1727204722.36312: variable 'ansible_timeout' from source: unknown 43681 1727204722.36314: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204722.36442: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 43681 1727204722.36458: variable 'omit' from source: magic vars 43681 1727204722.36464: starting attempt loop 43681 1727204722.36468: running the handler 43681 1727204722.36483: _low_level_execute_command(): starting 43681 1727204722.36488: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 43681 1727204722.37416: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204722.37439: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204722.39190: stdout chunk (state=3): >>>/root <<< 43681 1727204722.39362: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204722.39413: stderr chunk (state=3): >>><<< 43681 1727204722.39417: stdout chunk (state=3): >>><<< 43681 1727204722.39446: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204722.39461: _low_level_execute_command(): starting 43681 1727204722.39469: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204722.3944724-44923-126296046388215 `" && echo ansible-tmp-1727204722.3944724-44923-126296046388215="` echo /root/.ansible/tmp/ansible-tmp-1727204722.3944724-44923-126296046388215 `" ) && sleep 0' 43681 1727204722.40793: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204722.40798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 43681 1727204722.40984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204722.40988: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master 
debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204722.41308: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204722.41375: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204722.43399: stdout chunk (state=3): >>>ansible-tmp-1727204722.3944724-44923-126296046388215=/root/.ansible/tmp/ansible-tmp-1727204722.3944724-44923-126296046388215 <<< 43681 1727204722.43599: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204722.43609: stderr chunk (state=3): >>><<< 43681 1727204722.43612: stdout chunk (state=3): >>><<< 43681 1727204722.43640: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204722.3944724-44923-126296046388215=/root/.ansible/tmp/ansible-tmp-1727204722.3944724-44923-126296046388215 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204722.43795: variable 'ansible_module_compression' from source: unknown 43681 1727204722.43799: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-436817kzep24q/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 43681 1727204722.44005: variable 'ansible_facts' from source: unknown 43681 1727204722.44258: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204722.3944724-44923-126296046388215/AnsiballZ_network_connections.py 43681 1727204722.44597: Sending initial data 43681 1727204722.44601: Sent initial data (168 bytes) 43681 1727204722.46096: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204722.46100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204722.46103: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204722.46105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204722.46155: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204722.46161: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204722.46499: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204722.46556: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204722.48188: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 43681 1727204722.48202: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 43681 1727204722.48228: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 43681 1727204722.48364: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-436817kzep24q/tmp3bf83k__ /root/.ansible/tmp/ansible-tmp-1727204722.3944724-44923-126296046388215/AnsiballZ_network_connections.py <<< 43681 1727204722.48368: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204722.3944724-44923-126296046388215/AnsiballZ_network_connections.py" debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-436817kzep24q/tmp3bf83k__" to remote "/root/.ansible/tmp/ansible-tmp-1727204722.3944724-44923-126296046388215/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204722.3944724-44923-126296046388215/AnsiballZ_network_connections.py" <<< 43681 1727204722.51364: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204722.51369: stderr chunk (state=3): >>><<< 43681 1727204722.51372: stdout chunk (state=3): >>><<< 43681 1727204722.51460: done transferring module to remote 43681 1727204722.51463: _low_level_execute_command(): starting 43681 1727204722.51466: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204722.3944724-44923-126296046388215/ /root/.ansible/tmp/ansible-tmp-1727204722.3944724-44923-126296046388215/AnsiballZ_network_connections.py && sleep 0' 43681 1727204722.53005: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204722.53010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204722.53013: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204722.53015: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204722.53243: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204722.53396: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204722.55340: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204722.55524: stdout chunk (state=3): >>><<< 43681 1727204722.55528: stderr chunk (state=3): >>><<< 43681 1727204722.55531: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204722.55533: _low_level_execute_command(): starting 43681 1727204722.55536: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204722.3944724-44923-126296046388215/AnsiballZ_network_connections.py && sleep 0' 43681 1727204722.56645: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204722.57032: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204722.57035: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204722.57102: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204722.91416: stdout chunk (state=3): >>> <<< 43681 1727204722.91435: stdout chunk (state=3): >>>{"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 43681 1727204722.93555: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204722.93632: stderr chunk (state=3): >>>Shared connection to 10.31.10.90 closed. <<< 43681 1727204722.93674: stderr chunk (state=3): >>><<< 43681 1727204722.93682: stdout chunk (state=3): >>><<< 43681 1727204722.93706: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
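Note on the result above: the module_args echoed back by fedora.linux_system_roles.network_connections (provider "nm", one connection "ethtest0" with state "down") correspond to the role's public variables network_provider and network_connections. A minimal play that would drive the role into this same invocation could look like the sketch below; it is reconstructed from the logged arguments rather than copied from the playbook that produced this run, and the host pattern is illustrative.

- hosts: managed-node3
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        # becomes the module's "provider" argument ("nm" = NetworkManager backend)
        network_provider: nm
        # each entry becomes one item in the module's "connections" list
        network_connections:
          - name: ethtest0
            state: down
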
43681 1727204722.93740: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204722.3944724-44923-126296046388215/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 43681 1727204722.93749: _low_level_execute_command(): starting 43681 1727204722.93754: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204722.3944724-44923-126296046388215/ > /dev/null 2>&1 && sleep 0' 43681 1727204722.94241: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204722.94245: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204722.94247: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 43681 1727204722.94250: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 43681 1727204722.94252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204722.94300: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204722.94304: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204722.94349: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204722.96256: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204722.96313: stderr chunk (state=3): >>><<< 43681 1727204722.96323: stdout chunk (state=3): >>><<< 43681 1727204722.96342: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204722.96349: handler run complete 43681 1727204722.96376: attempt loop complete, returning result 43681 1727204722.96379: _execute() done 43681 1727204722.96381: dumping result to json 43681 1727204722.96388: done dumping result, returning 43681 1727204722.96399: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12b410aa-8751-9e86-7728-000000000081] 43681 1727204722.96416: sending task result for task 12b410aa-8751-9e86-7728-000000000081 43681 1727204722.96523: done sending task result for task 12b410aa-8751-9e86-7728-000000000081 43681 1727204722.96526: WORKER PROCESS EXITING changed: [managed-node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 43681 1727204722.96658: no more pending results, returning what we have 43681 1727204722.96662: results queue empty 43681 1727204722.96663: checking for any_errors_fatal 43681 1727204722.96671: done checking for any_errors_fatal 43681 1727204722.96671: checking for max_fail_percentage 43681 1727204722.96673: done checking for max_fail_percentage 43681 1727204722.96674: checking to see if all hosts have failed and the running result is not ok 43681 1727204722.96675: done checking to see if all hosts have failed 43681 1727204722.96676: getting the remaining hosts for this loop 43681 1727204722.96678: done getting the remaining hosts for this loop 43681 1727204722.96681: getting the next task for host managed-node3 43681 1727204722.96687: done getting next task for host managed-node3 43681 1727204722.96694: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 43681 1727204722.96696: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204722.96713: getting variables 43681 1727204722.96714: in VariableManager get_vars() 43681 1727204722.96757: Calling all_inventory to load vars for managed-node3 43681 1727204722.96759: Calling groups_inventory to load vars for managed-node3 43681 1727204722.96762: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204722.96773: Calling all_plugins_play to load vars for managed-node3 43681 1727204722.96776: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204722.96779: Calling groups_plugins_play to load vars for managed-node3 43681 1727204722.98178: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204723.01213: done with get_vars() 43681 1727204723.01264: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:05:23 -0400 (0:00:00.826) 0:00:30.680 ***** 43681 1727204723.01372: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_state 43681 1727204723.01764: worker is 1 (out of 1 available) 43681 1727204723.01781: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_state 43681 1727204723.02198: done queuing things up, now waiting for results queue to drain 43681 1727204723.02202: waiting for pending results... 43681 1727204723.02407: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state 43681 1727204723.02601: in run() - task 12b410aa-8751-9e86-7728-000000000082 43681 1727204723.02607: variable 'ansible_search_path' from source: unknown 43681 1727204723.02611: variable 'ansible_search_path' from source: unknown 43681 1727204723.02615: calling self._execute() 43681 1727204723.02619: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204723.02625: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204723.02719: variable 'omit' from source: magic vars 43681 1727204723.03128: variable 'ansible_distribution_major_version' from source: facts 43681 1727204723.03139: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204723.03366: variable 'network_state' from source: role '' defaults 43681 1727204723.03370: Evaluated conditional (network_state != {}): False 43681 1727204723.03373: when evaluation is False, skipping this task 43681 1727204723.03376: _execute() done 43681 1727204723.03378: dumping result to json 43681 1727204723.03457: done dumping result, returning 43681 1727204723.03467: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state [12b410aa-8751-9e86-7728-000000000082] 43681 1727204723.03474: sending task result for task 12b410aa-8751-9e86-7728-000000000082 43681 1727204723.03588: done sending task result for task 12b410aa-8751-9e86-7728-000000000082 43681 1727204723.03594: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 43681 1727204723.03659: no more pending results, returning what we have 43681 1727204723.03665: results queue empty 43681 1727204723.03666: checking for any_errors_fatal 43681 1727204723.03683: done checking for any_errors_fatal 43681 1727204723.03685: checking for max_fail_percentage 43681 
1727204723.03687: done checking for max_fail_percentage 43681 1727204723.03691: checking to see if all hosts have failed and the running result is not ok 43681 1727204723.03693: done checking to see if all hosts have failed 43681 1727204723.03694: getting the remaining hosts for this loop 43681 1727204723.03696: done getting the remaining hosts for this loop 43681 1727204723.03701: getting the next task for host managed-node3 43681 1727204723.03708: done getting next task for host managed-node3 43681 1727204723.03714: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 43681 1727204723.03718: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 43681 1727204723.03738: getting variables 43681 1727204723.03740: in VariableManager get_vars() 43681 1727204723.03788: Calling all_inventory to load vars for managed-node3 43681 1727204723.03996: Calling groups_inventory to load vars for managed-node3 43681 1727204723.04000: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204723.04014: Calling all_plugins_play to load vars for managed-node3 43681 1727204723.04018: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204723.04022: Calling groups_plugins_play to load vars for managed-node3 43681 1727204723.07981: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204723.14578: done with get_vars() 43681 1727204723.14631: done getting variables 43681 1727204723.14714: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:05:23 -0400 (0:00:00.133) 0:00:30.814 ***** 43681 1727204723.14762: entering _queue_task() for managed-node3/debug 43681 1727204723.15174: worker is 1 (out of 1 available) 43681 1727204723.15187: exiting _queue_task() for managed-node3/debug 43681 1727204723.15319: done queuing things up, now waiting for results queue to drain 43681 1727204723.15325: waiting for pending results... 
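Note: the "Configure networking state" task above was skipped because its condition (network_state != {}) evaluated to False; this run leaves network_state at the role default of an empty dict and manages the interface through network_connections instead. Purely as a hypothetical illustration (the nmstate-style keys below are not taken from this run), a non-empty network_state variable of roughly the following shape would make that code path execute:

# hypothetical example only, not part of the recorded run
network_state:
  interfaces:
    - name: ethtest0
      type: ethernet
      state: down
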
43681 1727204723.15764: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 43681 1727204723.15769: in run() - task 12b410aa-8751-9e86-7728-000000000083 43681 1727204723.15772: variable 'ansible_search_path' from source: unknown 43681 1727204723.15774: variable 'ansible_search_path' from source: unknown 43681 1727204723.15809: calling self._execute() 43681 1727204723.15946: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204723.15969: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204723.15987: variable 'omit' from source: magic vars 43681 1727204723.16517: variable 'ansible_distribution_major_version' from source: facts 43681 1727204723.16595: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204723.16598: variable 'omit' from source: magic vars 43681 1727204723.16614: variable 'omit' from source: magic vars 43681 1727204723.16681: variable 'omit' from source: magic vars 43681 1727204723.16744: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204723.16800: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204723.16833: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204723.16873: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204723.16894: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204723.16957: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204723.16961: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204723.16963: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204723.17107: Set connection var ansible_shell_type to sh 43681 1727204723.17176: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204723.17180: Set connection var ansible_timeout to 10 43681 1727204723.17182: Set connection var ansible_pipelining to False 43681 1727204723.17185: Set connection var ansible_connection to ssh 43681 1727204723.17191: Set connection var ansible_shell_executable to /bin/sh 43681 1727204723.17212: variable 'ansible_shell_executable' from source: unknown 43681 1727204723.17223: variable 'ansible_connection' from source: unknown 43681 1727204723.17231: variable 'ansible_module_compression' from source: unknown 43681 1727204723.17238: variable 'ansible_shell_type' from source: unknown 43681 1727204723.17245: variable 'ansible_shell_executable' from source: unknown 43681 1727204723.17252: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204723.17261: variable 'ansible_pipelining' from source: unknown 43681 1727204723.17267: variable 'ansible_timeout' from source: unknown 43681 1727204723.17284: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204723.17498: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 43681 
1727204723.17510: variable 'omit' from source: magic vars 43681 1727204723.17519: starting attempt loop 43681 1727204723.17609: running the handler 43681 1727204723.17716: variable '__network_connections_result' from source: set_fact 43681 1727204723.17794: handler run complete 43681 1727204723.17842: attempt loop complete, returning result 43681 1727204723.17852: _execute() done 43681 1727204723.17861: dumping result to json 43681 1727204723.17870: done dumping result, returning 43681 1727204723.17888: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12b410aa-8751-9e86-7728-000000000083] 43681 1727204723.17903: sending task result for task 12b410aa-8751-9e86-7728-000000000083 ok: [managed-node3] => { "__network_connections_result.stderr_lines": [ "" ] } 43681 1727204723.18133: no more pending results, returning what we have 43681 1727204723.18139: results queue empty 43681 1727204723.18140: checking for any_errors_fatal 43681 1727204723.18153: done checking for any_errors_fatal 43681 1727204723.18155: checking for max_fail_percentage 43681 1727204723.18162: done checking for max_fail_percentage 43681 1727204723.18164: checking to see if all hosts have failed and the running result is not ok 43681 1727204723.18165: done checking to see if all hosts have failed 43681 1727204723.18166: getting the remaining hosts for this loop 43681 1727204723.18168: done getting the remaining hosts for this loop 43681 1727204723.18173: getting the next task for host managed-node3 43681 1727204723.18181: done getting next task for host managed-node3 43681 1727204723.18186: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 43681 1727204723.18262: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204723.18282: getting variables 43681 1727204723.18285: in VariableManager get_vars() 43681 1727204723.18417: Calling all_inventory to load vars for managed-node3 43681 1727204723.18434: Calling groups_inventory to load vars for managed-node3 43681 1727204723.18439: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204723.18453: Calling all_plugins_play to load vars for managed-node3 43681 1727204723.18458: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204723.18462: Calling groups_plugins_play to load vars for managed-node3 43681 1727204723.19110: done sending task result for task 12b410aa-8751-9e86-7728-000000000083 43681 1727204723.19113: WORKER PROCESS EXITING 43681 1727204723.22229: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204723.31567: done with get_vars() 43681 1727204723.31599: done getting variables 43681 1727204723.31650: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 15:05:23 -0400 (0:00:00.169) 0:00:30.983 ***** 43681 1727204723.31673: entering _queue_task() for managed-node3/debug 43681 1727204723.31965: worker is 1 (out of 1 available) 43681 1727204723.31981: exiting _queue_task() for managed-node3/debug 43681 1727204723.31996: done queuing things up, now waiting for results queue to drain 43681 1727204723.31998: waiting for pending results... 
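Note: the "Show stderr messages for the network_connections" task above printed __network_connections_result.stderr_lines, which is effectively empty here because the module's stderr was a single newline. Judging from that output, the task behaves like a plain debug of the registered fact; an equivalent task would look roughly like the sketch below (the exact task at tasks/main.yml:177 may differ).

- name: Show stderr messages for the network_connections
  debug:
    var: __network_connections_result.stderr_lines

Later tasks in a play could also gate on the same registered fact, for example with a when condition such as __network_connections_result.stderr | trim | length > 0.
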
43681 1727204723.32204: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 43681 1727204723.32293: in run() - task 12b410aa-8751-9e86-7728-000000000084 43681 1727204723.32306: variable 'ansible_search_path' from source: unknown 43681 1727204723.32310: variable 'ansible_search_path' from source: unknown 43681 1727204723.32350: calling self._execute() 43681 1727204723.32440: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204723.32451: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204723.32460: variable 'omit' from source: magic vars 43681 1727204723.32801: variable 'ansible_distribution_major_version' from source: facts 43681 1727204723.32811: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204723.32819: variable 'omit' from source: magic vars 43681 1727204723.32863: variable 'omit' from source: magic vars 43681 1727204723.33096: variable 'omit' from source: magic vars 43681 1727204723.33100: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204723.33103: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204723.33106: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204723.33108: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204723.33111: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204723.33150: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204723.33162: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204723.33172: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204723.33323: Set connection var ansible_shell_type to sh 43681 1727204723.33349: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204723.33363: Set connection var ansible_timeout to 10 43681 1727204723.33377: Set connection var ansible_pipelining to False 43681 1727204723.33388: Set connection var ansible_connection to ssh 43681 1727204723.33403: Set connection var ansible_shell_executable to /bin/sh 43681 1727204723.33448: variable 'ansible_shell_executable' from source: unknown 43681 1727204723.33461: variable 'ansible_connection' from source: unknown 43681 1727204723.33470: variable 'ansible_module_compression' from source: unknown 43681 1727204723.33477: variable 'ansible_shell_type' from source: unknown 43681 1727204723.33485: variable 'ansible_shell_executable' from source: unknown 43681 1727204723.33495: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204723.33562: variable 'ansible_pipelining' from source: unknown 43681 1727204723.33566: variable 'ansible_timeout' from source: unknown 43681 1727204723.33569: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204723.33780: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 43681 
1727204723.33784: variable 'omit' from source: magic vars 43681 1727204723.33788: starting attempt loop 43681 1727204723.33791: running the handler 43681 1727204723.33861: variable '__network_connections_result' from source: set_fact 43681 1727204723.33954: variable '__network_connections_result' from source: set_fact 43681 1727204723.34053: handler run complete 43681 1727204723.34076: attempt loop complete, returning result 43681 1727204723.34080: _execute() done 43681 1727204723.34083: dumping result to json 43681 1727204723.34091: done dumping result, returning 43681 1727204723.34108: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12b410aa-8751-9e86-7728-000000000084] 43681 1727204723.34114: sending task result for task 12b410aa-8751-9e86-7728-000000000084 43681 1727204723.34225: done sending task result for task 12b410aa-8751-9e86-7728-000000000084 43681 1727204723.34228: WORKER PROCESS EXITING ok: [managed-node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 43681 1727204723.34329: no more pending results, returning what we have 43681 1727204723.34333: results queue empty 43681 1727204723.34334: checking for any_errors_fatal 43681 1727204723.34349: done checking for any_errors_fatal 43681 1727204723.34350: checking for max_fail_percentage 43681 1727204723.34353: done checking for max_fail_percentage 43681 1727204723.34354: checking to see if all hosts have failed and the running result is not ok 43681 1727204723.34355: done checking to see if all hosts have failed 43681 1727204723.34356: getting the remaining hosts for this loop 43681 1727204723.34357: done getting the remaining hosts for this loop 43681 1727204723.34362: getting the next task for host managed-node3 43681 1727204723.34368: done getting next task for host managed-node3 43681 1727204723.34373: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 43681 1727204723.34375: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204723.34387: getting variables 43681 1727204723.34388: in VariableManager get_vars() 43681 1727204723.34428: Calling all_inventory to load vars for managed-node3 43681 1727204723.34431: Calling groups_inventory to load vars for managed-node3 43681 1727204723.34434: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204723.34443: Calling all_plugins_play to load vars for managed-node3 43681 1727204723.34453: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204723.34458: Calling groups_plugins_play to load vars for managed-node3 43681 1727204723.35718: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204723.37350: done with get_vars() 43681 1727204723.37377: done getting variables 43681 1727204723.37431: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 15:05:23 -0400 (0:00:00.057) 0:00:31.041 ***** 43681 1727204723.37462: entering _queue_task() for managed-node3/debug 43681 1727204723.37734: worker is 1 (out of 1 available) 43681 1727204723.37751: exiting _queue_task() for managed-node3/debug 43681 1727204723.37764: done queuing things up, now waiting for results queue to drain 43681 1727204723.37766: waiting for pending results... 43681 1727204723.37972: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 43681 1727204723.38060: in run() - task 12b410aa-8751-9e86-7728-000000000085 43681 1727204723.38073: variable 'ansible_search_path' from source: unknown 43681 1727204723.38076: variable 'ansible_search_path' from source: unknown 43681 1727204723.38113: calling self._execute() 43681 1727204723.38207: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204723.38215: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204723.38233: variable 'omit' from source: magic vars 43681 1727204723.38553: variable 'ansible_distribution_major_version' from source: facts 43681 1727204723.38564: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204723.38667: variable 'network_state' from source: role '' defaults 43681 1727204723.38679: Evaluated conditional (network_state != {}): False 43681 1727204723.38683: when evaluation is False, skipping this task 43681 1727204723.38686: _execute() done 43681 1727204723.38691: dumping result to json 43681 1727204723.38695: done dumping result, returning 43681 1727204723.38704: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12b410aa-8751-9e86-7728-000000000085] 43681 1727204723.38710: sending task result for task 12b410aa-8751-9e86-7728-000000000085 43681 1727204723.38805: done sending task result for task 12b410aa-8751-9e86-7728-000000000085 43681 1727204723.38808: WORKER PROCESS EXITING skipping: [managed-node3] => { "false_condition": "network_state != {}" } 43681 1727204723.38863: no more pending results, returning what we 
have 43681 1727204723.38867: results queue empty 43681 1727204723.38868: checking for any_errors_fatal 43681 1727204723.38875: done checking for any_errors_fatal 43681 1727204723.38876: checking for max_fail_percentage 43681 1727204723.38878: done checking for max_fail_percentage 43681 1727204723.38879: checking to see if all hosts have failed and the running result is not ok 43681 1727204723.38880: done checking to see if all hosts have failed 43681 1727204723.38881: getting the remaining hosts for this loop 43681 1727204723.38884: done getting the remaining hosts for this loop 43681 1727204723.38891: getting the next task for host managed-node3 43681 1727204723.38897: done getting next task for host managed-node3 43681 1727204723.38902: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 43681 1727204723.38905: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 43681 1727204723.38919: getting variables 43681 1727204723.38923: in VariableManager get_vars() 43681 1727204723.38960: Calling all_inventory to load vars for managed-node3 43681 1727204723.38963: Calling groups_inventory to load vars for managed-node3 43681 1727204723.38966: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204723.38976: Calling all_plugins_play to load vars for managed-node3 43681 1727204723.38979: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204723.38982: Calling groups_plugins_play to load vars for managed-node3 43681 1727204723.40394: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204723.42046: done with get_vars() 43681 1727204723.42078: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 15:05:23 -0400 (0:00:00.047) 0:00:31.088 ***** 43681 1727204723.42163: entering _queue_task() for managed-node3/ping 43681 1727204723.42444: worker is 1 (out of 1 available) 43681 1727204723.42461: exiting _queue_task() for managed-node3/ping 43681 1727204723.42472: done queuing things up, now waiting for results queue to drain 43681 1727204723.42474: waiting for pending results... 
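Note on the "Re-test connectivity" task queued above: in the lines that follow, the connection variables are set with ansible_pipelining False, so running the ping module costs several SSH operations over the multiplexed connection (mkdir of a remote temp dir, an sftp put of AnsiballZ_ping.py, chmod, the python3.12 execution, and a final rm of the temp dir). Enabling pipelining typically collapses this to a single remote python invocation per task for modules that support it. A sketch of one way to enable it for these hosts, assuming a group_vars file (the file path is illustrative):

# group_vars/all.yml (illustrative location)
ansible_pipelining: true
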
43681 1727204723.42669: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 43681 1727204723.42761: in run() - task 12b410aa-8751-9e86-7728-000000000086 43681 1727204723.42774: variable 'ansible_search_path' from source: unknown 43681 1727204723.42777: variable 'ansible_search_path' from source: unknown 43681 1727204723.42816: calling self._execute() 43681 1727204723.42906: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204723.42914: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204723.42927: variable 'omit' from source: magic vars 43681 1727204723.43262: variable 'ansible_distribution_major_version' from source: facts 43681 1727204723.43274: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204723.43280: variable 'omit' from source: magic vars 43681 1727204723.43316: variable 'omit' from source: magic vars 43681 1727204723.43348: variable 'omit' from source: magic vars 43681 1727204723.43386: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204723.43423: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204723.43441: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204723.43457: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204723.43469: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204723.43501: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204723.43504: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204723.43510: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204723.43595: Set connection var ansible_shell_type to sh 43681 1727204723.43604: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204723.43608: Set connection var ansible_timeout to 10 43681 1727204723.43617: Set connection var ansible_pipelining to False 43681 1727204723.43625: Set connection var ansible_connection to ssh 43681 1727204723.43631: Set connection var ansible_shell_executable to /bin/sh 43681 1727204723.43649: variable 'ansible_shell_executable' from source: unknown 43681 1727204723.43652: variable 'ansible_connection' from source: unknown 43681 1727204723.43656: variable 'ansible_module_compression' from source: unknown 43681 1727204723.43661: variable 'ansible_shell_type' from source: unknown 43681 1727204723.43664: variable 'ansible_shell_executable' from source: unknown 43681 1727204723.43668: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204723.43673: variable 'ansible_pipelining' from source: unknown 43681 1727204723.43675: variable 'ansible_timeout' from source: unknown 43681 1727204723.43682: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204723.43855: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 43681 1727204723.43865: variable 'omit' from source: magic vars 43681 
1727204723.43871: starting attempt loop 43681 1727204723.43874: running the handler 43681 1727204723.43888: _low_level_execute_command(): starting 43681 1727204723.43898: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 43681 1727204723.44443: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204723.44447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204723.44451: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204723.44496: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204723.44523: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204723.44564: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204723.46340: stdout chunk (state=3): >>>/root <<< 43681 1727204723.46449: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204723.46508: stderr chunk (state=3): >>><<< 43681 1727204723.46511: stdout chunk (state=3): >>><<< 43681 1727204723.46539: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204723.46553: _low_level_execute_command(): starting 43681 1727204723.46560: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204723.4653983-45038-127758515063629 `" && echo ansible-tmp-1727204723.4653983-45038-127758515063629="` echo 
/root/.ansible/tmp/ansible-tmp-1727204723.4653983-45038-127758515063629 `" ) && sleep 0' 43681 1727204723.47045: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204723.47050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204723.47054: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204723.47064: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204723.47106: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204723.47109: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204723.47158: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204723.49162: stdout chunk (state=3): >>>ansible-tmp-1727204723.4653983-45038-127758515063629=/root/.ansible/tmp/ansible-tmp-1727204723.4653983-45038-127758515063629 <<< 43681 1727204723.49279: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204723.49333: stderr chunk (state=3): >>><<< 43681 1727204723.49336: stdout chunk (state=3): >>><<< 43681 1727204723.49355: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204723.4653983-45038-127758515063629=/root/.ansible/tmp/ansible-tmp-1727204723.4653983-45038-127758515063629 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204723.49412: variable 'ansible_module_compression' from source: unknown 43681 1727204723.49451: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-436817kzep24q/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 43681 1727204723.49487: 
variable 'ansible_facts' from source: unknown 43681 1727204723.49548: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204723.4653983-45038-127758515063629/AnsiballZ_ping.py 43681 1727204723.49686: Sending initial data 43681 1727204723.49694: Sent initial data (153 bytes) 43681 1727204723.50137: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204723.50166: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204723.50175: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 43681 1727204723.50178: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204723.50237: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204723.50241: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204723.50269: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204723.51890: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 43681 1727204723.51919: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 43681 1727204723.51951: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-436817kzep24q/tmpjev8egoy /root/.ansible/tmp/ansible-tmp-1727204723.4653983-45038-127758515063629/AnsiballZ_ping.py <<< 43681 1727204723.51960: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204723.4653983-45038-127758515063629/AnsiballZ_ping.py" <<< 43681 1727204723.51986: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-436817kzep24q/tmpjev8egoy" to remote "/root/.ansible/tmp/ansible-tmp-1727204723.4653983-45038-127758515063629/AnsiballZ_ping.py" <<< 43681 1727204723.51995: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204723.4653983-45038-127758515063629/AnsiballZ_ping.py" <<< 43681 1727204723.52731: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204723.52797: stderr chunk (state=3): >>><<< 43681 1727204723.52801: stdout chunk (state=3): >>><<< 43681 1727204723.52820: done transferring module to remote 43681 1727204723.52831: _low_level_execute_command(): starting 43681 1727204723.52837: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204723.4653983-45038-127758515063629/ /root/.ansible/tmp/ansible-tmp-1727204723.4653983-45038-127758515063629/AnsiballZ_ping.py && sleep 0' 43681 1727204723.53297: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204723.53334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 43681 1727204723.53338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204723.53341: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204723.53400: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204723.53404: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204723.53447: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204723.55379: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204723.55383: stdout chunk (state=3): >>><<< 43681 1727204723.55385: stderr chunk (state=3): >>><<< 43681 1727204723.55388: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204723.55395: _low_level_execute_command(): starting 43681 1727204723.55398: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204723.4653983-45038-127758515063629/AnsiballZ_ping.py && sleep 0' 43681 1727204723.56046: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204723.56050: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204723.56052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204723.56055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204723.56066: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 43681 1727204723.56074: stderr chunk (state=3): >>>debug2: match not found <<< 43681 1727204723.56085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204723.56136: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 43681 1727204723.56140: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address <<< 43681 1727204723.56143: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 43681 1727204723.56178: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204723.56181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204723.56184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204723.56187: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 43681 1727204723.56268: stderr chunk (state=3): >>>debug2: match found <<< 43681 1727204723.56274: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204723.56291: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204723.56300: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204723.56377: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204723.73381: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 43681 1727204723.74735: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
<<< 43681 1727204723.74799: stderr chunk (state=3): >>><<< 43681 1727204723.74803: stdout chunk (state=3): >>><<< 43681 1727204723.74818: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 43681 1727204723.74846: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204723.4653983-45038-127758515063629/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 43681 1727204723.74857: _low_level_execute_command(): starting 43681 1727204723.74865: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204723.4653983-45038-127758515063629/ > /dev/null 2>&1 && sleep 0' 43681 1727204723.75369: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204723.75372: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 43681 1727204723.75375: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address <<< 43681 1727204723.75377: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 43681 1727204723.75379: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204723.75438: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204723.75442: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204723.75446: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204723.75485: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204723.77395: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204723.77451: stderr chunk (state=3): >>><<< 43681 1727204723.77455: stdout chunk (state=3): >>><<< 43681 1727204723.77471: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204723.77479: handler run complete 43681 1727204723.77496: attempt loop complete, returning result 43681 1727204723.77499: _execute() done 43681 1727204723.77504: dumping result to json 43681 1727204723.77508: done dumping result, returning 43681 1727204723.77520: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [12b410aa-8751-9e86-7728-000000000086] 43681 1727204723.77528: sending task result for task 12b410aa-8751-9e86-7728-000000000086 43681 1727204723.77631: done sending task result for task 12b410aa-8751-9e86-7728-000000000086 43681 1727204723.77633: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "ping": "pong" } 43681 1727204723.77702: no more pending results, returning what we have 43681 1727204723.77706: results queue empty 43681 1727204723.77707: checking for any_errors_fatal 43681 1727204723.77716: done checking for any_errors_fatal 43681 1727204723.77716: checking for max_fail_percentage 43681 1727204723.77718: done checking for max_fail_percentage 43681 1727204723.77720: checking to see if all hosts have failed and the running result is not ok 43681 1727204723.77721: done checking to see if all hosts have failed 43681 1727204723.77722: getting the remaining hosts for this loop 43681 1727204723.77723: done getting the remaining hosts for this loop 43681 1727204723.77728: getting the next task for host managed-node3 43681 1727204723.77737: done getting next task for host managed-node3 43681 1727204723.77739: ^ task is: TASK: meta (role_complete) 43681 1727204723.77741: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 43681 1727204723.77753: getting variables 43681 1727204723.77755: in VariableManager get_vars() 43681 1727204723.77809: Calling all_inventory to load vars for managed-node3 43681 1727204723.77812: Calling groups_inventory to load vars for managed-node3 43681 1727204723.77815: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204723.77826: Calling all_plugins_play to load vars for managed-node3 43681 1727204723.77830: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204723.77833: Calling groups_plugins_play to load vars for managed-node3 43681 1727204723.79161: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204723.80927: done with get_vars() 43681 1727204723.80956: done getting variables 43681 1727204723.81031: done queuing things up, now waiting for results queue to drain 43681 1727204723.81032: results queue empty 43681 1727204723.81033: checking for any_errors_fatal 43681 1727204723.81035: done checking for any_errors_fatal 43681 1727204723.81036: checking for max_fail_percentage 43681 1727204723.81037: done checking for max_fail_percentage 43681 1727204723.81037: checking to see if all hosts have failed and the running result is not ok 43681 1727204723.81038: done checking to see if all hosts have failed 43681 1727204723.81039: getting the remaining hosts for this loop 43681 1727204723.81039: done getting the remaining hosts for this loop 43681 1727204723.81042: getting the next task for host managed-node3 43681 1727204723.81046: done getting next task for host managed-node3 43681 1727204723.81047: ^ task is: TASK: meta (flush_handlers) 43681 1727204723.81049: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204723.81051: getting variables 43681 1727204723.81052: in VariableManager get_vars() 43681 1727204723.81064: Calling all_inventory to load vars for managed-node3 43681 1727204723.81066: Calling groups_inventory to load vars for managed-node3 43681 1727204723.81067: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204723.81072: Calling all_plugins_play to load vars for managed-node3 43681 1727204723.81074: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204723.81076: Calling groups_plugins_play to load vars for managed-node3 43681 1727204723.82207: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204723.83847: done with get_vars() 43681 1727204723.83885: done getting variables 43681 1727204723.83932: in VariableManager get_vars() 43681 1727204723.83944: Calling all_inventory to load vars for managed-node3 43681 1727204723.83946: Calling groups_inventory to load vars for managed-node3 43681 1727204723.83947: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204723.83952: Calling all_plugins_play to load vars for managed-node3 43681 1727204723.83954: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204723.83957: Calling groups_plugins_play to load vars for managed-node3 43681 1727204723.85192: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204723.86798: done with get_vars() 43681 1727204723.86831: done queuing things up, now waiting for results queue to drain 43681 1727204723.86833: results queue empty 43681 1727204723.86834: checking for any_errors_fatal 43681 1727204723.86835: done checking for any_errors_fatal 43681 1727204723.86836: checking for max_fail_percentage 43681 1727204723.86837: done checking for max_fail_percentage 43681 1727204723.86837: checking to see if all hosts have failed and the running result is not ok 43681 1727204723.86838: done checking to see if all hosts have failed 43681 1727204723.86838: getting the remaining hosts for this loop 43681 1727204723.86839: done getting the remaining hosts for this loop 43681 1727204723.86842: getting the next task for host managed-node3 43681 1727204723.86845: done getting next task for host managed-node3 43681 1727204723.86846: ^ task is: TASK: meta (flush_handlers) 43681 1727204723.86848: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204723.86854: getting variables 43681 1727204723.86855: in VariableManager get_vars() 43681 1727204723.86866: Calling all_inventory to load vars for managed-node3 43681 1727204723.86867: Calling groups_inventory to load vars for managed-node3 43681 1727204723.86869: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204723.86874: Calling all_plugins_play to load vars for managed-node3 43681 1727204723.86876: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204723.86878: Calling groups_plugins_play to load vars for managed-node3 43681 1727204723.88016: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204723.90302: done with get_vars() 43681 1727204723.90338: done getting variables 43681 1727204723.90410: in VariableManager get_vars() 43681 1727204723.90427: Calling all_inventory to load vars for managed-node3 43681 1727204723.90430: Calling groups_inventory to load vars for managed-node3 43681 1727204723.90433: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204723.90440: Calling all_plugins_play to load vars for managed-node3 43681 1727204723.90443: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204723.90447: Calling groups_plugins_play to load vars for managed-node3 43681 1727204723.92477: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204723.94706: done with get_vars() 43681 1727204723.94744: done queuing things up, now waiting for results queue to drain 43681 1727204723.94746: results queue empty 43681 1727204723.94747: checking for any_errors_fatal 43681 1727204723.94748: done checking for any_errors_fatal 43681 1727204723.94748: checking for max_fail_percentage 43681 1727204723.94750: done checking for max_fail_percentage 43681 1727204723.94750: checking to see if all hosts have failed and the running result is not ok 43681 1727204723.94751: done checking to see if all hosts have failed 43681 1727204723.94751: getting the remaining hosts for this loop 43681 1727204723.94753: done getting the remaining hosts for this loop 43681 1727204723.94756: getting the next task for host managed-node3 43681 1727204723.94759: done getting next task for host managed-node3 43681 1727204723.94759: ^ task is: None 43681 1727204723.94761: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204723.94762: done queuing things up, now waiting for results queue to drain 43681 1727204723.94762: results queue empty 43681 1727204723.94763: checking for any_errors_fatal 43681 1727204723.94763: done checking for any_errors_fatal 43681 1727204723.94764: checking for max_fail_percentage 43681 1727204723.94765: done checking for max_fail_percentage 43681 1727204723.94765: checking to see if all hosts have failed and the running result is not ok 43681 1727204723.94766: done checking to see if all hosts have failed 43681 1727204723.94767: getting the next task for host managed-node3 43681 1727204723.94768: done getting next task for host managed-node3 43681 1727204723.94769: ^ task is: None 43681 1727204723.94770: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 43681 1727204723.94812: in VariableManager get_vars() 43681 1727204723.94831: done with get_vars() 43681 1727204723.94837: in VariableManager get_vars() 43681 1727204723.94844: done with get_vars() 43681 1727204723.94848: variable 'omit' from source: magic vars 43681 1727204723.94872: in VariableManager get_vars() 43681 1727204723.94879: done with get_vars() 43681 1727204723.94898: variable 'omit' from source: magic vars PLAY [Delete the interface] **************************************************** 43681 1727204723.95070: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 43681 1727204723.95098: getting the remaining hosts for this loop 43681 1727204723.95099: done getting the remaining hosts for this loop 43681 1727204723.95102: getting the next task for host managed-node3 43681 1727204723.95104: done getting next task for host managed-node3 43681 1727204723.95106: ^ task is: TASK: Gathering Facts 43681 1727204723.95107: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204723.95109: getting variables 43681 1727204723.95110: in VariableManager get_vars() 43681 1727204723.95117: Calling all_inventory to load vars for managed-node3 43681 1727204723.95119: Calling groups_inventory to load vars for managed-node3 43681 1727204723.95121: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204723.95126: Calling all_plugins_play to load vars for managed-node3 43681 1727204723.95128: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204723.95131: Calling groups_plugins_play to load vars for managed-node3 43681 1727204723.96632: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204723.98400: done with get_vars() 43681 1727204723.98428: done getting variables 43681 1727204723.98472: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5 Tuesday 24 September 2024 15:05:23 -0400 (0:00:00.563) 0:00:31.651 ***** 43681 1727204723.98497: entering _queue_task() for managed-node3/gather_facts 43681 1727204723.98770: worker is 1 (out of 1 available) 43681 1727204723.98784: exiting _queue_task() for managed-node3/gather_facts 43681 1727204723.98798: done queuing things up, now waiting for results queue to drain 43681 1727204723.98800: waiting for pending results... 
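The "Re-test connectivity" ping task above and the Gathering Facts task that starts below both follow the same remote-execution round trip that the _low_level_execute_command() entries trace: create a temporary directory under the remote ~/.ansible/tmp, transfer the AnsiballZ payload over sftp (the SSH2_FXP_* and "sftp> put" lines), chmod it, run it with the remote Python, read back a single JSON document on stdout, and remove the temporary directory, with every command wrapped in /bin/sh -c '... && sleep 0' and sent over the already-established multiplexed SSH connection ("auto-mux: Trying existing master"). The sketch below is a rough, simplified illustration of that sequence only, not Ansible's connection or shell plugin code; host, local_payload, run_ssh and execute_module are hypothetical names, scp stands in for the sftp transfer seen in the log, and /usr/bin/python3 is a placeholder for the discovered interpreter (/usr/bin/python3.12 in this run).

import json
import subprocess
import time

def run_ssh(host: str, command: str) -> subprocess.CompletedProcess:
    # Every remote command in the log is wrapped in /bin/sh -c '... && sleep 0'.
    wrapped = f"/bin/sh -c '{command} && sleep 0'"
    return subprocess.run(["ssh", host, wrapped], capture_output=True, text=True)

def execute_module(host: str, local_payload: str) -> dict:
    # Hypothetical helper mirroring the sequence of _low_level_execute_command() calls above.
    stamp = f"ansible-tmp-{time.time()}"
    tmpdir = f"~/.ansible/tmp/{stamp}"
    # Make the remote temp dir (same umask/mkdir pattern as the log).
    run_ssh(host, f'( umask 77 && mkdir -p "` echo {tmpdir} `" )')
    # Copy the module payload; the real run uses sftp, scp is a stand-in here.
    subprocess.run(["scp", local_payload, f"{host}:{tmpdir}/AnsiballZ_module.py"], check=True)
    # Make the payload executable, run it, then clean up the temp dir.
    run_ssh(host, f"chmod u+x {tmpdir}/ {tmpdir}/AnsiballZ_module.py")
    result = run_ssh(host, f"/usr/bin/python3 {tmpdir}/AnsiballZ_module.py")
    run_ssh(host, f"rm -f -r {tmpdir}/ > /dev/null 2>&1")
    # The module prints exactly one JSON object on stdout, e.g. {"ping": "pong", ...}.
    return json.loads(result.stdout)

In the real run each of these steps reuses the persistent ControlMaster socket, which is why the per-command SSH debug output shows only mux_client_request_session rather than a fresh key exchange, and why stdout/stderr arrive as the interleaved "chunk" lines recorded above and below.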
43681 1727204723.99000: running TaskExecutor() for managed-node3/TASK: Gathering Facts 43681 1727204723.99085: in run() - task 12b410aa-8751-9e86-7728-00000000057e 43681 1727204723.99100: variable 'ansible_search_path' from source: unknown 43681 1727204723.99139: calling self._execute() 43681 1727204723.99229: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204723.99235: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204723.99246: variable 'omit' from source: magic vars 43681 1727204723.99644: variable 'ansible_distribution_major_version' from source: facts 43681 1727204723.99656: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204723.99663: variable 'omit' from source: magic vars 43681 1727204723.99692: variable 'omit' from source: magic vars 43681 1727204723.99725: variable 'omit' from source: magic vars 43681 1727204723.99766: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204723.99803: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204723.99822: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204723.99841: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204723.99853: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204723.99881: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204723.99885: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204723.99891: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204723.99980: Set connection var ansible_shell_type to sh 43681 1727204723.99987: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204723.99995: Set connection var ansible_timeout to 10 43681 1727204724.00004: Set connection var ansible_pipelining to False 43681 1727204724.00013: Set connection var ansible_connection to ssh 43681 1727204724.00018: Set connection var ansible_shell_executable to /bin/sh 43681 1727204724.00042: variable 'ansible_shell_executable' from source: unknown 43681 1727204724.00045: variable 'ansible_connection' from source: unknown 43681 1727204724.00048: variable 'ansible_module_compression' from source: unknown 43681 1727204724.00052: variable 'ansible_shell_type' from source: unknown 43681 1727204724.00055: variable 'ansible_shell_executable' from source: unknown 43681 1727204724.00060: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204724.00065: variable 'ansible_pipelining' from source: unknown 43681 1727204724.00068: variable 'ansible_timeout' from source: unknown 43681 1727204724.00075: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204724.00238: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 43681 1727204724.00249: variable 'omit' from source: magic vars 43681 1727204724.00255: starting attempt loop 43681 1727204724.00258: running the 
handler 43681 1727204724.00273: variable 'ansible_facts' from source: unknown 43681 1727204724.00297: _low_level_execute_command(): starting 43681 1727204724.00307: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 43681 1727204724.00864: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204724.00869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204724.00872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204724.00941: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204724.00944: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204724.00947: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204724.00976: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204724.02745: stdout chunk (state=3): >>>/root <<< 43681 1727204724.02855: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204724.02918: stderr chunk (state=3): >>><<< 43681 1727204724.02922: stdout chunk (state=3): >>><<< 43681 1727204724.02950: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204724.02963: _low_level_execute_command(): starting 43681 1727204724.02971: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204724.0295024-45158-231814415745149 `" && echo ansible-tmp-1727204724.0295024-45158-231814415745149="` echo 
/root/.ansible/tmp/ansible-tmp-1727204724.0295024-45158-231814415745149 `" ) && sleep 0' 43681 1727204724.03467: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204724.03473: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204724.03476: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204724.03486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204724.03553: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204724.03565: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204724.03568: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204724.03604: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204724.05578: stdout chunk (state=3): >>>ansible-tmp-1727204724.0295024-45158-231814415745149=/root/.ansible/tmp/ansible-tmp-1727204724.0295024-45158-231814415745149 <<< 43681 1727204724.05692: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204724.05758: stderr chunk (state=3): >>><<< 43681 1727204724.05763: stdout chunk (state=3): >>><<< 43681 1727204724.05782: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204724.0295024-45158-231814415745149=/root/.ansible/tmp/ansible-tmp-1727204724.0295024-45158-231814415745149 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204724.05816: variable 'ansible_module_compression' from source: unknown 43681 1727204724.05865: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-436817kzep24q/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 43681 1727204724.05927: variable 'ansible_facts' from source: unknown 43681 1727204724.06051: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204724.0295024-45158-231814415745149/AnsiballZ_setup.py 43681 1727204724.06196: Sending initial data 43681 1727204724.06200: Sent initial data (154 bytes) 43681 1727204724.06707: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204724.06711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 43681 1727204724.06714: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 43681 1727204724.06717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204724.06766: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204724.06769: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204724.06812: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204724.08401: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 43681 1727204724.08431: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 43681 1727204724.08470: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-436817kzep24q/tmp470lsk28 /root/.ansible/tmp/ansible-tmp-1727204724.0295024-45158-231814415745149/AnsiballZ_setup.py <<< 43681 1727204724.08474: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204724.0295024-45158-231814415745149/AnsiballZ_setup.py" <<< 43681 1727204724.08502: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-436817kzep24q/tmp470lsk28" to remote "/root/.ansible/tmp/ansible-tmp-1727204724.0295024-45158-231814415745149/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204724.0295024-45158-231814415745149/AnsiballZ_setup.py" <<< 43681 1727204724.10142: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204724.10231: stderr chunk (state=3): >>><<< 43681 1727204724.10235: stdout chunk (state=3): >>><<< 43681 1727204724.10261: done transferring module to remote 43681 1727204724.10272: _low_level_execute_command(): starting 43681 1727204724.10278: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204724.0295024-45158-231814415745149/ /root/.ansible/tmp/ansible-tmp-1727204724.0295024-45158-231814415745149/AnsiballZ_setup.py && sleep 0' 43681 1727204724.10754: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204724.10757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204724.10760: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204724.10762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204724.10826: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204724.10829: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204724.10861: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204724.12699: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204724.12756: stderr chunk (state=3): >>><<< 43681 1727204724.12760: stdout chunk (state=3): >>><<< 43681 1727204724.12780: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204724.12783: _low_level_execute_command(): starting 43681 1727204724.12790: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204724.0295024-45158-231814415745149/AnsiballZ_setup.py && sleep 0' 43681 1727204724.13425: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204724.13555: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204724.85204: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAI5YZQ7OH6eqgmanrwxkUl16pMvE2q26X32NofYRKBzF04m84VIsiCBP80rN+sGEKnRhTwlxJwcSfAyscmxkynk8ozeR0SaMEECkbOjee1DqGR1yz8VSKEIk2gZ+ImYscF6c32jGvz1w/gz9baswEs+v92Ljqv3+V3s8foVkwWM1AAAAFQDApo03iAyJzp9y7AillVl9LpN8rwAAAIBNHNvfLLH/rvWMdavYWGiljarx5Z8cDKFv4QiliuY2AenrQ5mjBN3ZJZuDpmwC9vuoPM+TWxp9pbrnVJy4VM6iS8c/Lr9I982fUD4neMvJEywdnYtsRhezGMCk57/Npw91h6EKhcAYiaFF53jl540WIjTvu2bEA8Hgb11YGH+isAAAAIAkremps+61DEFeDWQjRHbf8fZzhmpUYduU+sHRW5usa/1cOOeeN/8XBHfMST6TPedAY/6t7Oxda9D2mq6mo2Rl9arSQWcBypqwvzRiz0LGnRnElGtXKJALy6vYKG7xi+29ZmqlBvD14cB7/wSZqZP9MkRj3+QzQJLvNnuGRyLguA==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDBj8PEtqglWtlJ3r3hgP2TELjSd8JOOpjIitLlWjKdUao5ePB6PWTf9MZV0rLZr0re7hAS1EWeexARYQakyETmyOoPmRCaD5vvrfN3AJJ6I+O2EhApLpYrEORJbTfrme6AoCGmxQG8tR7j3YpVOvePZ65ka7FDUWnNLI0DWpyDURAKmvOxtiOcYazpmB7GJ/5ycpEAV7KGp7tEQ9MNIAbSaYTBXVBNa5V2HyEmcabs+/Qy/jp8OWy+Tl3uCUV0SmFplVCKib9Kp3eEMZd5udXsYnmUYtLNMJQkQOzTdol5AozustkdBnasVn/RSnQpWQMBrrUQMxchNOb8FDAuH6AONEVJl9mHY6mk3zfkkyPZE6sIrMIj0B48xTWzMIjC+N9SN7DRRUWzjYIqkL5fsYu0fkkGuZeNvyJRlv8h7oFWA7YtvNHdNYf41mkXryERg8V3zI0aZcmQul6XTOxywwd4b5sudMIng09hfyPOKtnYi6DIN2h5FxOWlvBEbLlcd2U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPUvqdp1GSRMDwSqfOZO1hLGpDfzy51B9cIhTK2AWy7qlUXPaSlJ0jc31uj+CW3SnUW36VSKRHdj9R9hJev9Zic=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFL7RdA+aCgUcBhcJBLwti3mnwduhYXxSw8RlI3Cvebm", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec272ed147e29e35f2e68cd6465c5ec1", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_is_chroot": false, "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_hostnqn": "", "ansible_fips": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enfo<<< 43681 1727204724.85230: stdout chunk (state=3): >>>rcing", "mode": "enforcing", "type": "targeted"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", 
"ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_loadavg": {"1m": 0.68603515625, "5m": 0.826171875, "15m": 0.54052734375}, "ansible_iscsi_iqn": "", "ansible_local": {}, "ansible_lsb": {}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 50414 10.31.10.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 50414 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "05", "second": "24", "epoch": "1727204724", "epoch_int": "1727204724", "date": "2024-09-24", "time": "15:05:24", "iso8601_micro": "2024-09-24T19:05:24.449907Z", "iso8601": "2024-09-24T19:05:24Z", "iso8601_basic": "20240924T150524449907", "iso8601_basic_short": "20240924T150524", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2842, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 875, "free": 2842}, "nocache": {"free": 3475, "used": 242}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_uuid": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", 
"sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": <<< 43681 1727204724.85251: stdout chunk (state=3): >>>"", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 1228, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251139760128, "block_size": 4096, "block_total": 64479564, "block_available": 61313418, "block_used": 3166146, "inode_total": 16384000, "inode_available": 16302069, "inode_used": 81931, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["eth0", "peerethtest0", "ethtest0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::37d3:4e93:30d:de94", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": 
"off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fragl<<< 43681 1727204724.85273: stdout chunk (state=3): >>>ist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": "86:6c:78:87:31:5f", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::7ad5:7db7:8a46:12ed", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", 
"tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "mac<<< 43681 1727204724.85294: stdout chunk (state=3): >>>sec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "b2:8f:09:74:fb:c0", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::b08f:9ff:fe74:fbc0", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", 
"loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.90"], "ansible_all_ipv6_addresses": ["fe80::37d3:4e93:30d:de94", "fe80::7ad5:7db7:8a46:12ed", "fe80::b08f:9ff:fe74:fbc0"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::37d3:4e93:30d:de94", "fe80::7ad5:7db7:8a46:12ed", "fe80::b08f:9ff:fe74:fbc0"]}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 43681 1727204724.87411: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 43681 1727204724.87480: stderr chunk (state=3): >>><<< 43681 1727204724.87483: stdout chunk (state=3): >>><<< 43681 1727204724.87531: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAI5YZQ7OH6eqgmanrwxkUl16pMvE2q26X32NofYRKBzF04m84VIsiCBP80rN+sGEKnRhTwlxJwcSfAyscmxkynk8ozeR0SaMEECkbOjee1DqGR1yz8VSKEIk2gZ+ImYscF6c32jGvz1w/gz9baswEs+v92Ljqv3+V3s8foVkwWM1AAAAFQDApo03iAyJzp9y7AillVl9LpN8rwAAAIBNHNvfLLH/rvWMdavYWGiljarx5Z8cDKFv4QiliuY2AenrQ5mjBN3ZJZuDpmwC9vuoPM+TWxp9pbrnVJy4VM6iS8c/Lr9I982fUD4neMvJEywdnYtsRhezGMCk57/Npw91h6EKhcAYiaFF53jl540WIjTvu2bEA8Hgb11YGH+isAAAAIAkremps+61DEFeDWQjRHbf8fZzhmpUYduU+sHRW5usa/1cOOeeN/8XBHfMST6TPedAY/6t7Oxda9D2mq6mo2Rl9arSQWcBypqwvzRiz0LGnRnElGtXKJALy6vYKG7xi+29ZmqlBvD14cB7/wSZqZP9MkRj3+QzQJLvNnuGRyLguA==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDBj8PEtqglWtlJ3r3hgP2TELjSd8JOOpjIitLlWjKdUao5ePB6PWTf9MZV0rLZr0re7hAS1EWeexARYQakyETmyOoPmRCaD5vvrfN3AJJ6I+O2EhApLpYrEORJbTfrme6AoCGmxQG8tR7j3YpVOvePZ65ka7FDUWnNLI0DWpyDURAKmvOxtiOcYazpmB7GJ/5ycpEAV7KGp7tEQ9MNIAbSaYTBXVBNa5V2HyEmcabs+/Qy/jp8OWy+Tl3uCUV0SmFplVCKib9Kp3eEMZd5udXsYnmUYtLNMJQkQOzTdol5AozustkdBnasVn/RSnQpWQMBrrUQMxchNOb8FDAuH6AONEVJl9mHY6mk3zfkkyPZE6sIrMIj0B48xTWzMIjC+N9SN7DRRUWzjYIqkL5fsYu0fkkGuZeNvyJRlv8h7oFWA7YtvNHdNYf41mkXryERg8V3zI0aZcmQul6XTOxywwd4b5sudMIng09hfyPOKtnYi6DIN2h5FxOWlvBEbLlcd2U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPUvqdp1GSRMDwSqfOZO1hLGpDfzy51B9cIhTK2AWy7qlUXPaSlJ0jc31uj+CW3SnUW36VSKRHdj9R9hJev9Zic=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", 
"ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFL7RdA+aCgUcBhcJBLwti3mnwduhYXxSw8RlI3Cvebm", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec272ed147e29e35f2e68cd6465c5ec1", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_is_chroot": false, "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_hostnqn": "", "ansible_fips": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_loadavg": {"1m": 0.68603515625, "5m": 0.826171875, "15m": 0.54052734375}, "ansible_iscsi_iqn": "", "ansible_local": {}, "ansible_lsb": {}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 50414 10.31.10.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 50414 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": 
"/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "05", "second": "24", "epoch": "1727204724", "epoch_int": "1727204724", "date": "2024-09-24", "time": "15:05:24", "iso8601_micro": "2024-09-24T19:05:24.449907Z", "iso8601": "2024-09-24T19:05:24Z", "iso8601_basic": "20240924T150524449907", "iso8601_basic_short": "20240924T150524", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2842, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 875, "free": 2842}, "nocache": {"free": 3475, "used": 242}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_uuid": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 1228, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251139760128, 
"block_size": 4096, "block_total": 64479564, "block_available": 61313418, "block_used": 3166146, "inode_total": 16384000, "inode_available": 16302069, "inode_used": 81931, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["eth0", "peerethtest0", "ethtest0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::37d3:4e93:30d:de94", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", 
"tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": "86:6c:78:87:31:5f", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::7ad5:7db7:8a46:12ed", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", 
"tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "b2:8f:09:74:fb:c0", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::b08f:9ff:fe74:fbc0", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:5e:c8:16:36:1d", "mtu": 
9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.90"], "ansible_all_ipv6_addresses": ["fe80::37d3:4e93:30d:de94", "fe80::7ad5:7db7:8a46:12ed", "fe80::b08f:9ff:fe74:fbc0"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::37d3:4e93:30d:de94", "fe80::7ad5:7db7:8a46:12ed", "fe80::b08f:9ff:fe74:fbc0"]}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
43681 1727204724.87954: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204724.0295024-45158-231814415745149/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 43681 1727204724.87974: _low_level_execute_command(): starting 43681 1727204724.87985: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204724.0295024-45158-231814415745149/ > /dev/null 2>&1 && sleep 0' 43681 1727204724.88474: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204724.88478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204724.88480: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204724.88483: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 43681 1727204724.88485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204724.88544: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204724.88552: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204724.88556: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204724.88585: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204724.90491: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204724.90544: stderr chunk (state=3): >>><<< 43681 1727204724.90548: stdout chunk (state=3): >>><<< 43681 1727204724.90562: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204724.90572: handler run complete 43681 1727204724.90720: variable 'ansible_facts' from source: unknown 43681 1727204724.90821: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204724.91168: variable 'ansible_facts' from source: unknown 43681 1727204724.91258: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204724.91400: attempt loop complete, returning result 43681 1727204724.91406: _execute() done 43681 1727204724.91409: dumping result to json 43681 1727204724.91441: done dumping result, returning 43681 1727204724.91450: done running TaskExecutor() for managed-node3/TASK: Gathering Facts [12b410aa-8751-9e86-7728-00000000057e] 43681 1727204724.91456: sending task result for task 12b410aa-8751-9e86-7728-00000000057e 43681 1727204724.91884: done sending task result for task 12b410aa-8751-9e86-7728-00000000057e 43681 1727204724.91888: WORKER PROCESS EXITING ok: [managed-node3] 43681 1727204724.92268: no more pending results, returning what we have 43681 1727204724.92270: results queue empty 43681 1727204724.92271: checking for any_errors_fatal 43681 1727204724.92272: done checking for any_errors_fatal 43681 1727204724.92273: checking for max_fail_percentage 43681 1727204724.92274: done checking for max_fail_percentage 43681 1727204724.92275: checking to see if all hosts have failed and the running result is not ok 43681 1727204724.92275: done checking to see if all hosts have failed 43681 1727204724.92276: getting the remaining hosts for this loop 43681 1727204724.92277: done getting the remaining hosts for this loop 43681 1727204724.92280: getting the next task for host managed-node3 43681 1727204724.92284: done getting next task for host managed-node3 43681 1727204724.92285: ^ task is: TASK: meta (flush_handlers) 43681 1727204724.92286: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204724.92291: getting variables 43681 1727204724.92292: in VariableManager get_vars() 43681 1727204724.92311: Calling all_inventory to load vars for managed-node3 43681 1727204724.92313: Calling groups_inventory to load vars for managed-node3 43681 1727204724.92315: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204724.92326: Calling all_plugins_play to load vars for managed-node3 43681 1727204724.92329: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204724.92333: Calling groups_plugins_play to load vars for managed-node3 43681 1727204724.93645: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204724.95274: done with get_vars() 43681 1727204724.95300: done getting variables 43681 1727204724.95360: in VariableManager get_vars() 43681 1727204724.95370: Calling all_inventory to load vars for managed-node3 43681 1727204724.95373: Calling groups_inventory to load vars for managed-node3 43681 1727204724.95375: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204724.95379: Calling all_plugins_play to load vars for managed-node3 43681 1727204724.95381: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204724.95383: Calling groups_plugins_play to load vars for managed-node3 43681 1727204724.96481: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204724.98171: done with get_vars() 43681 1727204724.98201: done queuing things up, now waiting for results queue to drain 43681 1727204724.98203: results queue empty 43681 1727204724.98204: checking for any_errors_fatal 43681 1727204724.98207: done checking for any_errors_fatal 43681 1727204724.98208: checking for max_fail_percentage 43681 1727204724.98209: done checking for max_fail_percentage 43681 1727204724.98213: checking to see if all hosts have failed and the running result is not ok 43681 1727204724.98214: done checking to see if all hosts have failed 43681 1727204724.98215: getting the remaining hosts for this loop 43681 1727204724.98215: done getting the remaining hosts for this loop 43681 1727204724.98217: getting the next task for host managed-node3 43681 1727204724.98223: done getting next task for host managed-node3 43681 1727204724.98225: ^ task is: TASK: Include the task 'delete_interface.yml' 43681 1727204724.98226: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204724.98228: getting variables 43681 1727204724.98229: in VariableManager get_vars() 43681 1727204724.98236: Calling all_inventory to load vars for managed-node3 43681 1727204724.98238: Calling groups_inventory to load vars for managed-node3 43681 1727204724.98240: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204724.98244: Calling all_plugins_play to load vars for managed-node3 43681 1727204724.98246: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204724.98248: Calling groups_plugins_play to load vars for managed-node3 43681 1727204724.99354: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204725.00948: done with get_vars() 43681 1727204725.00970: done getting variables TASK [Include the task 'delete_interface.yml'] ********************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:8 Tuesday 24 September 2024 15:05:25 -0400 (0:00:01.025) 0:00:32.677 ***** 43681 1727204725.01042: entering _queue_task() for managed-node3/include_tasks 43681 1727204725.01330: worker is 1 (out of 1 available) 43681 1727204725.01345: exiting _queue_task() for managed-node3/include_tasks 43681 1727204725.01360: done queuing things up, now waiting for results queue to drain 43681 1727204725.01362: waiting for pending results... 43681 1727204725.01554: running TaskExecutor() for managed-node3/TASK: Include the task 'delete_interface.yml' 43681 1727204725.01633: in run() - task 12b410aa-8751-9e86-7728-000000000089 43681 1727204725.01647: variable 'ansible_search_path' from source: unknown 43681 1727204725.01680: calling self._execute() 43681 1727204725.01769: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204725.01777: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204725.01787: variable 'omit' from source: magic vars 43681 1727204725.02132: variable 'ansible_distribution_major_version' from source: facts 43681 1727204725.02146: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204725.02150: _execute() done 43681 1727204725.02156: dumping result to json 43681 1727204725.02159: done dumping result, returning 43681 1727204725.02166: done running TaskExecutor() for managed-node3/TASK: Include the task 'delete_interface.yml' [12b410aa-8751-9e86-7728-000000000089] 43681 1727204725.02172: sending task result for task 12b410aa-8751-9e86-7728-000000000089 43681 1727204725.02267: done sending task result for task 12b410aa-8751-9e86-7728-000000000089 43681 1727204725.02270: WORKER PROCESS EXITING 43681 1727204725.02307: no more pending results, returning what we have 43681 1727204725.02313: in VariableManager get_vars() 43681 1727204725.02349: Calling all_inventory to load vars for managed-node3 43681 1727204725.02352: Calling groups_inventory to load vars for managed-node3 43681 1727204725.02356: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204725.02371: Calling all_plugins_play to load vars for managed-node3 43681 1727204725.02375: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204725.02381: Calling groups_plugins_play to load vars for managed-node3 43681 1727204725.03729: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204725.05338: done with get_vars() 43681 
1727204725.05358: variable 'ansible_search_path' from source: unknown 43681 1727204725.05371: we have included files to process 43681 1727204725.05371: generating all_blocks data 43681 1727204725.05372: done generating all_blocks data 43681 1727204725.05373: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 43681 1727204725.05374: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 43681 1727204725.05376: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 43681 1727204725.05570: done processing included file 43681 1727204725.05571: iterating over new_blocks loaded from include file 43681 1727204725.05572: in VariableManager get_vars() 43681 1727204725.05582: done with get_vars() 43681 1727204725.05583: filtering new block on tags 43681 1727204725.05597: done filtering new block on tags 43681 1727204725.05599: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed-node3 43681 1727204725.05603: extending task lists for all hosts with included blocks 43681 1727204725.05628: done extending task lists 43681 1727204725.05629: done processing included files 43681 1727204725.05630: results queue empty 43681 1727204725.05630: checking for any_errors_fatal 43681 1727204725.05632: done checking for any_errors_fatal 43681 1727204725.05632: checking for max_fail_percentage 43681 1727204725.05633: done checking for max_fail_percentage 43681 1727204725.05634: checking to see if all hosts have failed and the running result is not ok 43681 1727204725.05634: done checking to see if all hosts have failed 43681 1727204725.05635: getting the remaining hosts for this loop 43681 1727204725.05636: done getting the remaining hosts for this loop 43681 1727204725.05638: getting the next task for host managed-node3 43681 1727204725.05640: done getting next task for host managed-node3 43681 1727204725.05642: ^ task is: TASK: Remove test interface if necessary 43681 1727204725.05644: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204725.05647: getting variables 43681 1727204725.05648: in VariableManager get_vars() 43681 1727204725.05656: Calling all_inventory to load vars for managed-node3 43681 1727204725.05657: Calling groups_inventory to load vars for managed-node3 43681 1727204725.05659: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204725.05664: Calling all_plugins_play to load vars for managed-node3 43681 1727204725.05666: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204725.05668: Calling groups_plugins_play to load vars for managed-node3 43681 1727204725.06780: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204725.08424: done with get_vars() 43681 1727204725.08445: done getting variables 43681 1727204725.08485: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Tuesday 24 September 2024 15:05:25 -0400 (0:00:00.074) 0:00:32.751 ***** 43681 1727204725.08511: entering _queue_task() for managed-node3/command 43681 1727204725.08784: worker is 1 (out of 1 available) 43681 1727204725.08802: exiting _queue_task() for managed-node3/command 43681 1727204725.08816: done queuing things up, now waiting for results queue to drain 43681 1727204725.08818: waiting for pending results... 43681 1727204725.09011: running TaskExecutor() for managed-node3/TASK: Remove test interface if necessary 43681 1727204725.09094: in run() - task 12b410aa-8751-9e86-7728-00000000058f 43681 1727204725.09107: variable 'ansible_search_path' from source: unknown 43681 1727204725.09111: variable 'ansible_search_path' from source: unknown 43681 1727204725.09146: calling self._execute() 43681 1727204725.09236: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204725.09242: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204725.09252: variable 'omit' from source: magic vars 43681 1727204725.09596: variable 'ansible_distribution_major_version' from source: facts 43681 1727204725.09605: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204725.09612: variable 'omit' from source: magic vars 43681 1727204725.09651: variable 'omit' from source: magic vars 43681 1727204725.09737: variable 'interface' from source: set_fact 43681 1727204725.09753: variable 'omit' from source: magic vars 43681 1727204725.09791: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204725.09827: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204725.09847: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204725.09863: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204725.09874: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 
1727204725.09906: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204725.09910: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204725.09914: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204725.10004: Set connection var ansible_shell_type to sh 43681 1727204725.10010: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204725.10017: Set connection var ansible_timeout to 10 43681 1727204725.10030: Set connection var ansible_pipelining to False 43681 1727204725.10039: Set connection var ansible_connection to ssh 43681 1727204725.10043: Set connection var ansible_shell_executable to /bin/sh 43681 1727204725.10064: variable 'ansible_shell_executable' from source: unknown 43681 1727204725.10067: variable 'ansible_connection' from source: unknown 43681 1727204725.10070: variable 'ansible_module_compression' from source: unknown 43681 1727204725.10072: variable 'ansible_shell_type' from source: unknown 43681 1727204725.10077: variable 'ansible_shell_executable' from source: unknown 43681 1727204725.10081: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204725.10086: variable 'ansible_pipelining' from source: unknown 43681 1727204725.10092: variable 'ansible_timeout' from source: unknown 43681 1727204725.10097: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204725.10234: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 43681 1727204725.10245: variable 'omit' from source: magic vars 43681 1727204725.10251: starting attempt loop 43681 1727204725.10254: running the handler 43681 1727204725.10272: _low_level_execute_command(): starting 43681 1727204725.10284: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 43681 1727204725.10843: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204725.10847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 43681 1727204725.10852: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204725.10896: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204725.10916: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204725.10921: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204725.10961: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 43681 1727204725.12712: stdout chunk (state=3): >>>/root <<< 43681 1727204725.12823: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204725.12876: stderr chunk (state=3): >>><<< 43681 1727204725.12879: stdout chunk (state=3): >>><<< 43681 1727204725.12906: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204725.12919: _low_level_execute_command(): starting 43681 1727204725.12927: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204725.1290636-45224-105032668938640 `" && echo ansible-tmp-1727204725.1290636-45224-105032668938640="` echo /root/.ansible/tmp/ansible-tmp-1727204725.1290636-45224-105032668938640 `" ) && sleep 0' 43681 1727204725.13384: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204725.13388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204725.13394: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204725.13406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204725.13454: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204725.13461: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204725.13506: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204725.15476: stdout chunk (state=3): 
>>>ansible-tmp-1727204725.1290636-45224-105032668938640=/root/.ansible/tmp/ansible-tmp-1727204725.1290636-45224-105032668938640 <<< 43681 1727204725.15895: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204725.15899: stdout chunk (state=3): >>><<< 43681 1727204725.15902: stderr chunk (state=3): >>><<< 43681 1727204725.15905: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204725.1290636-45224-105032668938640=/root/.ansible/tmp/ansible-tmp-1727204725.1290636-45224-105032668938640 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204725.15908: variable 'ansible_module_compression' from source: unknown 43681 1727204725.15910: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-436817kzep24q/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 43681 1727204725.15912: variable 'ansible_facts' from source: unknown 43681 1727204725.15980: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204725.1290636-45224-105032668938640/AnsiballZ_command.py 43681 1727204725.16167: Sending initial data 43681 1727204725.16266: Sent initial data (156 bytes) 43681 1727204725.17216: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204725.17255: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204725.17368: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204725.17397: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204725.17415: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204725.17447: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 43681 1727204725.17683: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204725.19210: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 43681 1727204725.19246: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 43681 1727204725.19309: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204725.1290636-45224-105032668938640/AnsiballZ_command.py" <<< 43681 1727204725.19340: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-436817kzep24q/tmpmz5zz0sx /root/.ansible/tmp/ansible-tmp-1727204725.1290636-45224-105032668938640/AnsiballZ_command.py <<< 43681 1727204725.19360: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-436817kzep24q/tmpmz5zz0sx" to remote "/root/.ansible/tmp/ansible-tmp-1727204725.1290636-45224-105032668938640/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204725.1290636-45224-105032668938640/AnsiballZ_command.py" <<< 43681 1727204725.21079: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204725.21088: stdout chunk (state=3): >>><<< 43681 1727204725.21093: stderr chunk (state=3): >>><<< 43681 1727204725.21095: done transferring module to remote 43681 1727204725.21098: _low_level_execute_command(): starting 43681 1727204725.21101: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204725.1290636-45224-105032668938640/ /root/.ansible/tmp/ansible-tmp-1727204725.1290636-45224-105032668938640/AnsiballZ_command.py && sleep 0' 43681 1727204725.21555: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204725.21558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 43681 1727204725.21560: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204725.21563: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204725.21565: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 
10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204725.21627: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204725.21629: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204725.21661: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204725.23629: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204725.23633: stdout chunk (state=3): >>><<< 43681 1727204725.23635: stderr chunk (state=3): >>><<< 43681 1727204725.23654: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204725.23663: _low_level_execute_command(): starting 43681 1727204725.23674: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204725.1290636-45224-105032668938640/AnsiballZ_command.py && sleep 0' 43681 1727204725.24365: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204725.24484: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204725.24500: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204725.24530: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204725.24599: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204725.42944: stdout chunk 
(state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "ethtest0"], "start": "2024-09-24 15:05:25.415571", "end": "2024-09-24 15:05:25.422824", "delta": "0:00:00.007253", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del ethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 43681 1727204725.44985: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 43681 1727204725.45049: stderr chunk (state=3): >>><<< 43681 1727204725.45054: stdout chunk (state=3): >>><<< 43681 1727204725.45072: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "ethtest0"], "start": "2024-09-24 15:05:25.415571", "end": "2024-09-24 15:05:25.422824", "delta": "0:00:00.007253", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del ethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
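Note: the module result above comes from the 'Remove test interface if necessary' task in tasks/delete_interface.yml. A plausible minimal reconstruction of that task is sketched below; the real file may differ. The changed_when: false line is inferred from the "Evaluated conditional (False): False" entry and the ok / "changed": false result that follow, and the interface variable is set by an earlier set_fact (to "ethtest0" in this run).

# Hypothetical sketch of tasks/delete_interface.yml -- not the verbatim file.
- name: Remove test interface if necessary
  ansible.builtin.command: ip link del {{ interface }}   # resolves to "ip link del ethtest0" here
  changed_when: false                                     # cleanup is reported as "ok", not "changed"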
43681 1727204725.45119: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204725.1290636-45224-105032668938640/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 43681 1727204725.45129: _low_level_execute_command(): starting 43681 1727204725.45135: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204725.1290636-45224-105032668938640/ > /dev/null 2>&1 && sleep 0' 43681 1727204725.45626: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204725.45631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 43681 1727204725.45634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 43681 1727204725.45636: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 43681 1727204725.45640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204725.45694: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204725.45698: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204725.45742: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204725.47692: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204725.47737: stderr chunk (state=3): >>><<< 43681 1727204725.47740: stdout chunk (state=3): >>><<< 43681 1727204725.47754: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204725.47762: handler run complete 43681 1727204725.47783: Evaluated conditional (False): False 43681 1727204725.47798: attempt loop complete, returning result 43681 1727204725.47801: _execute() done 43681 1727204725.47808: dumping result to json 43681 1727204725.47814: done dumping result, returning 43681 1727204725.47822: done running TaskExecutor() for managed-node3/TASK: Remove test interface if necessary [12b410aa-8751-9e86-7728-00000000058f] 43681 1727204725.47830: sending task result for task 12b410aa-8751-9e86-7728-00000000058f 43681 1727204725.47938: done sending task result for task 12b410aa-8751-9e86-7728-00000000058f 43681 1727204725.47941: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": [ "ip", "link", "del", "ethtest0" ], "delta": "0:00:00.007253", "end": "2024-09-24 15:05:25.422824", "rc": 0, "start": "2024-09-24 15:05:25.415571" } 43681 1727204725.48021: no more pending results, returning what we have 43681 1727204725.48026: results queue empty 43681 1727204725.48027: checking for any_errors_fatal 43681 1727204725.48029: done checking for any_errors_fatal 43681 1727204725.48030: checking for max_fail_percentage 43681 1727204725.48032: done checking for max_fail_percentage 43681 1727204725.48033: checking to see if all hosts have failed and the running result is not ok 43681 1727204725.48034: done checking to see if all hosts have failed 43681 1727204725.48035: getting the remaining hosts for this loop 43681 1727204725.48037: done getting the remaining hosts for this loop 43681 1727204725.48042: getting the next task for host managed-node3 43681 1727204725.48058: done getting next task for host managed-node3 43681 1727204725.48060: ^ task is: TASK: meta (flush_handlers) 43681 1727204725.48062: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204725.48067: getting variables 43681 1727204725.48069: in VariableManager get_vars() 43681 1727204725.48101: Calling all_inventory to load vars for managed-node3 43681 1727204725.48104: Calling groups_inventory to load vars for managed-node3 43681 1727204725.48108: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204725.48120: Calling all_plugins_play to load vars for managed-node3 43681 1727204725.48123: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204725.48127: Calling groups_plugins_play to load vars for managed-node3 43681 1727204725.49457: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204725.51412: done with get_vars() 43681 1727204725.51455: done getting variables 43681 1727204725.51547: in VariableManager get_vars() 43681 1727204725.51563: Calling all_inventory to load vars for managed-node3 43681 1727204725.51566: Calling groups_inventory to load vars for managed-node3 43681 1727204725.51569: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204725.51576: Calling all_plugins_play to load vars for managed-node3 43681 1727204725.51580: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204725.51584: Calling groups_plugins_play to load vars for managed-node3 43681 1727204725.53597: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204725.55841: done with get_vars() 43681 1727204725.55872: done queuing things up, now waiting for results queue to drain 43681 1727204725.55874: results queue empty 43681 1727204725.55875: checking for any_errors_fatal 43681 1727204725.55879: done checking for any_errors_fatal 43681 1727204725.55880: checking for max_fail_percentage 43681 1727204725.55882: done checking for max_fail_percentage 43681 1727204725.55883: checking to see if all hosts have failed and the running result is not ok 43681 1727204725.55883: done checking to see if all hosts have failed 43681 1727204725.55884: getting the remaining hosts for this loop 43681 1727204725.55885: done getting the remaining hosts for this loop 43681 1727204725.55887: getting the next task for host managed-node3 43681 1727204725.55892: done getting next task for host managed-node3 43681 1727204725.55894: ^ task is: TASK: meta (flush_handlers) 43681 1727204725.55895: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204725.55897: getting variables 43681 1727204725.55898: in VariableManager get_vars() 43681 1727204725.55906: Calling all_inventory to load vars for managed-node3 43681 1727204725.55908: Calling groups_inventory to load vars for managed-node3 43681 1727204725.55910: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204725.55915: Calling all_plugins_play to load vars for managed-node3 43681 1727204725.55917: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204725.55919: Calling groups_plugins_play to load vars for managed-node3 43681 1727204725.57295: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204725.59327: done with get_vars() 43681 1727204725.59349: done getting variables 43681 1727204725.59397: in VariableManager get_vars() 43681 1727204725.59405: Calling all_inventory to load vars for managed-node3 43681 1727204725.59407: Calling groups_inventory to load vars for managed-node3 43681 1727204725.59409: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204725.59414: Calling all_plugins_play to load vars for managed-node3 43681 1727204725.59416: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204725.59418: Calling groups_plugins_play to load vars for managed-node3 43681 1727204725.65691: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204725.68613: done with get_vars() 43681 1727204725.68666: done queuing things up, now waiting for results queue to drain 43681 1727204725.68669: results queue empty 43681 1727204725.68670: checking for any_errors_fatal 43681 1727204725.68679: done checking for any_errors_fatal 43681 1727204725.68680: checking for max_fail_percentage 43681 1727204725.68681: done checking for max_fail_percentage 43681 1727204725.68682: checking to see if all hosts have failed and the running result is not ok 43681 1727204725.68683: done checking to see if all hosts have failed 43681 1727204725.68684: getting the remaining hosts for this loop 43681 1727204725.68685: done getting the remaining hosts for this loop 43681 1727204725.68690: getting the next task for host managed-node3 43681 1727204725.68694: done getting next task for host managed-node3 43681 1727204725.68696: ^ task is: None 43681 1727204725.68698: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204725.68699: done queuing things up, now waiting for results queue to drain 43681 1727204725.68700: results queue empty 43681 1727204725.68701: checking for any_errors_fatal 43681 1727204725.68702: done checking for any_errors_fatal 43681 1727204725.68702: checking for max_fail_percentage 43681 1727204725.68704: done checking for max_fail_percentage 43681 1727204725.68705: checking to see if all hosts have failed and the running result is not ok 43681 1727204725.68706: done checking to see if all hosts have failed 43681 1727204725.68707: getting the next task for host managed-node3 43681 1727204725.68709: done getting next task for host managed-node3 43681 1727204725.68710: ^ task is: None 43681 1727204725.68712: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 43681 1727204725.68750: in VariableManager get_vars() 43681 1727204725.68775: done with get_vars() 43681 1727204725.68783: in VariableManager get_vars() 43681 1727204725.68801: done with get_vars() 43681 1727204725.68806: variable 'omit' from source: magic vars 43681 1727204725.68923: variable 'profile' from source: play vars 43681 1727204725.69039: in VariableManager get_vars() 43681 1727204725.69056: done with get_vars() 43681 1727204725.69079: variable 'omit' from source: magic vars 43681 1727204725.69160: variable 'profile' from source: play vars PLAY [Remove {{ profile }}] **************************************************** 43681 1727204725.70063: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 43681 1727204725.70088: getting the remaining hosts for this loop 43681 1727204725.70091: done getting the remaining hosts for this loop 43681 1727204725.70094: getting the next task for host managed-node3 43681 1727204725.70097: done getting next task for host managed-node3 43681 1727204725.70100: ^ task is: TASK: Gathering Facts 43681 1727204725.70101: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204725.70104: getting variables 43681 1727204725.70105: in VariableManager get_vars() 43681 1727204725.70118: Calling all_inventory to load vars for managed-node3 43681 1727204725.70120: Calling groups_inventory to load vars for managed-node3 43681 1727204725.70125: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204725.70131: Calling all_plugins_play to load vars for managed-node3 43681 1727204725.70134: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204725.70137: Calling groups_plugins_play to load vars for managed-node3 43681 1727204725.72333: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204725.75508: done with get_vars() 43681 1727204725.75556: done getting variables 43681 1727204725.75619: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3 Tuesday 24 September 2024 15:05:25 -0400 (0:00:00.671) 0:00:33.423 ***** 43681 1727204725.75654: entering _queue_task() for managed-node3/gather_facts 43681 1727204725.76054: worker is 1 (out of 1 available) 43681 1727204725.76067: exiting _queue_task() for managed-node3/gather_facts 43681 1727204725.76081: done queuing things up, now waiting for results queue to drain 43681 1727204725.76083: waiting for pending results... 
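The entries above show the linear strategy loading the gather_facts action for the new "Remove {{ profile }}" play and queuing it against managed-node3. A play head along these lines would trigger the same implicit "Gathering Facts" task; this is a sketch only, since remove_profile.yml (referenced in the task path above) is not reproduced in the log and its real contents may differ.

    - name: Remove {{ profile }}
      hosts: all            # assumption; the log only shows managed-node3 being targeted
      gather_facts: true    # the default; runs the setup module transferred and executed below
      tasks: []             # the real play's tasks are not shown here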
43681 1727204725.76423: running TaskExecutor() for managed-node3/TASK: Gathering Facts 43681 1727204725.76629: in run() - task 12b410aa-8751-9e86-7728-00000000059d 43681 1727204725.76633: variable 'ansible_search_path' from source: unknown 43681 1727204725.76636: calling self._execute() 43681 1727204725.76743: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204725.76760: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204725.76777: variable 'omit' from source: magic vars 43681 1727204725.77307: variable 'ansible_distribution_major_version' from source: facts 43681 1727204725.77331: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204725.77344: variable 'omit' from source: magic vars 43681 1727204725.77406: variable 'omit' from source: magic vars 43681 1727204725.77461: variable 'omit' from source: magic vars 43681 1727204725.77524: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204725.77617: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204725.77623: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204725.77637: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204725.77656: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204725.77699: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204725.77709: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204725.77717: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204725.77894: Set connection var ansible_shell_type to sh 43681 1727204725.77898: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204725.77901: Set connection var ansible_timeout to 10 43681 1727204725.77903: Set connection var ansible_pipelining to False 43681 1727204725.77906: Set connection var ansible_connection to ssh 43681 1727204725.77908: Set connection var ansible_shell_executable to /bin/sh 43681 1727204725.77941: variable 'ansible_shell_executable' from source: unknown 43681 1727204725.77951: variable 'ansible_connection' from source: unknown 43681 1727204725.77962: variable 'ansible_module_compression' from source: unknown 43681 1727204725.77971: variable 'ansible_shell_type' from source: unknown 43681 1727204725.77979: variable 'ansible_shell_executable' from source: unknown 43681 1727204725.77987: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204725.77998: variable 'ansible_pipelining' from source: unknown 43681 1727204725.78006: variable 'ansible_timeout' from source: unknown 43681 1727204725.78015: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204725.78246: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 43681 1727204725.78258: variable 'omit' from source: magic vars 43681 1727204725.78264: starting attempt loop 43681 1727204725.78267: running the 
handler 43681 1727204725.78282: variable 'ansible_facts' from source: unknown 43681 1727204725.78304: _low_level_execute_command(): starting 43681 1727204725.78312: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 43681 1727204725.78860: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204725.78866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204725.78870: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204725.78919: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204725.78927: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204725.78973: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204725.80718: stdout chunk (state=3): >>>/root <<< 43681 1727204725.80827: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204725.80880: stderr chunk (state=3): >>><<< 43681 1727204725.80884: stdout chunk (state=3): >>><<< 43681 1727204725.80910: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204725.80925: _low_level_execute_command(): starting 43681 1727204725.80929: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204725.809101-45293-13336983240 `" && echo ansible-tmp-1727204725.809101-45293-13336983240="` echo 
/root/.ansible/tmp/ansible-tmp-1727204725.809101-45293-13336983240 `" ) && sleep 0' 43681 1727204725.81379: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204725.81383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204725.81386: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204725.81398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204725.81447: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204725.81452: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204725.81493: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204725.83479: stdout chunk (state=3): >>>ansible-tmp-1727204725.809101-45293-13336983240=/root/.ansible/tmp/ansible-tmp-1727204725.809101-45293-13336983240 <<< 43681 1727204725.83599: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204725.83647: stderr chunk (state=3): >>><<< 43681 1727204725.83650: stdout chunk (state=3): >>><<< 43681 1727204725.83666: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204725.809101-45293-13336983240=/root/.ansible/tmp/ansible-tmp-1727204725.809101-45293-13336983240 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204725.83695: variable 'ansible_module_compression' from source: unknown 43681 1727204725.83737: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-436817kzep24q/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 43681 1727204725.83797: variable 'ansible_facts' 
from source: unknown 43681 1727204725.83916: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204725.809101-45293-13336983240/AnsiballZ_setup.py 43681 1727204725.84035: Sending initial data 43681 1727204725.84038: Sent initial data (149 bytes) 43681 1727204725.84482: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204725.84485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204725.84488: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 43681 1727204725.84493: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 43681 1727204725.84495: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204725.84547: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204725.84552: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204725.84588: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204725.86188: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 43681 1727204725.86204: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 43681 1727204725.86223: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 43681 1727204725.86255: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-436817kzep24q/tmpij3attui /root/.ansible/tmp/ansible-tmp-1727204725.809101-45293-13336983240/AnsiballZ_setup.py <<< 43681 1727204725.86265: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204725.809101-45293-13336983240/AnsiballZ_setup.py" <<< 43681 1727204725.86291: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-436817kzep24q/tmpij3attui" to remote "/root/.ansible/tmp/ansible-tmp-1727204725.809101-45293-13336983240/AnsiballZ_setup.py" <<< 43681 1727204725.86299: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204725.809101-45293-13336983240/AnsiballZ_setup.py" <<< 43681 1727204725.88471: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204725.88475: stdout chunk (state=3): >>><<< 43681 1727204725.88477: stderr chunk (state=3): >>><<< 43681 1727204725.88479: done transferring module to remote 43681 1727204725.88481: _low_level_execute_command(): starting 43681 1727204725.88484: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204725.809101-45293-13336983240/ /root/.ansible/tmp/ansible-tmp-1727204725.809101-45293-13336983240/AnsiballZ_setup.py && sleep 0' 43681 1727204725.88898: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204725.88913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204725.88927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204725.88974: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204725.89002: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204725.89031: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204725.90886: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204725.90945: stderr chunk (state=3): >>><<< 43681 1727204725.90948: stdout chunk (state=3): >>><<< 43681 1727204725.90964: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204725.90968: _low_level_execute_command(): starting 43681 1727204725.90973: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204725.809101-45293-13336983240/AnsiballZ_setup.py && sleep 0' 43681 1727204725.91441: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204725.91445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 43681 1727204725.91447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 43681 1727204725.91449: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204725.91454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204725.91496: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204725.91516: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204725.91562: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204726.60817: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec272ed147e29e35f2e68cd6465c5ec1", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, 
"ansible_effective_group_id": 0, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 50414 10.31.10.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 50414 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fibre_channel_wwn": [], "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAI5YZQ7OH6eqgmanrwxkUl16pMvE2q26X32NofYRKBzF04m84VIsiCBP80rN+sGEKnRhTwlxJwcSfAyscmxkynk8ozeR0SaMEECkbOjee1DqGR1yz8VSKEIk2gZ+ImYscF6c32jGvz1w/gz9baswEs+v92Ljqv3+V3s8foVkwWM1AAAAFQDApo03iAyJzp9y7AillVl9LpN8rwAAAIBNHNvfLLH/rvWMdavYWGiljarx5Z8cDKFv4QiliuY2AenrQ5mjBN3ZJZuDpmwC9vuoPM+TWxp9pbrnVJy4VM6iS8c/Lr9I982fUD4neMvJEywdnYtsRhezGMCk57/Npw91h6EKhcAYiaFF53jl540WIjTvu2bEA8Hgb11YGH+isAAAAIAkremps+61DEFeDWQjRHbf8fZzhmpUYduU+sHRW5usa/1cOOeeN/8XBHfMST6TPedAY/6t7Oxda9D2mq6mo2Rl9arSQWcBypqwvzRiz0LGnRnElGtXKJALy6vYKG7xi+29ZmqlBvD14cB7/wSZqZP9MkRj3+QzQJLvNnuGRyLguA==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDBj8PEtqglWtlJ3r3hgP2TELjSd8JOOpjIitLlWjKdUao5ePB6PWTf9MZV0rLZr0re7hAS1EWeexARYQakyETmyOoPmRCaD5vvrfN3AJJ6I+O2EhApLpYrEORJbTfrme6AoCGmxQG8tR7j3YpVOvePZ65ka7FDUWnNLI0DWpyDURAKmvOxtiOcYazpmB7GJ/5ycpEAV7KGp7tEQ9MNIAbSaYTBXVBNa5V2HyEmcabs+/Qy/jp8OWy+Tl3uCUV0SmFplVCKib9Kp3eEMZd5udXsYnmUYtLNMJQkQOzTdol5AozustkdBnasVn/RSnQpWQMBrrUQMxchNOb8FDAuH6AONEVJl9mHY6mk3zfkkyPZE6sIrMIj0B48xTWzMIjC+N9SN7DRRUWzjYIqkL5fsYu0fkkGuZeNvyJRlv8h7oFWA7YtvNHdNYf41mkXryERg8V3zI0aZcmQul6XTOxywwd4b5sudMIng09hfyPOKtnYi6DIN2h5FxOWlvBEbLlcd2U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPUvqdp1GSRMDwSqfOZO1hLGpDfzy51B9cIhTK2AWy7qlUXPaSlJ0jc31uj+CW3SnUW36VSKRHdj9R9hJev9Zic=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFL7RdA+aCgUcBhcJBLwti3mnwduhYXxSw8RlI3Cvebm", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_<<< 43681 1727204726.60856: stdout chunk (state=3): >>>cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2844, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 873, "free": 2844}, "nocache": {"free": 3477, "used": 240}, "swap": {"total": 
3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_uuid": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 1230, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251139743744, "block_size": 4096, "block_total": 64479564, "block_available": 61313414, "block_used": 3166150, "inode_total": 16384000, "inode_available": 16302069, "inode_used": 81931, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "05", "second": "26", "epoch": "1727204726", "epoch_int": "1727204726", "date": "2024-09-24", "time": "15:05:26", "iso8601_micro": "2024-09-24T19:05:26.563417Z", "iso8601": "2024-09-24T19:05:26Z", "iso8601_basic": "20240924T150526563417", "iso8601_basic_short": "20240924T150526", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"nameservers": ["127.0.0.53"], 
"options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_fips": false, "ansible_apparmor": {"status": "disabled"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_is_chroot": false, "ansible_loadavg": {"1m": 0.71142578125, "5m": 0.8291015625, "15m": 0.54345703125}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_iscsi_iqn": "", "ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_hostnqn": "", "ansible_lsb": {}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::37d3:4e93:30d:de94", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off 
[fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.90"], "ansible_all_ipv6_addresses": ["fe80::37d3:4e93:30d:de94"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::37d3:4e93:30d:de94"]}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 43681 1727204726.63050: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
<<< 43681 1727204726.63054: stdout chunk (state=3): >>><<< 43681 1727204726.63057: stderr chunk (state=3): >>><<< 43681 1727204726.63103: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec272ed147e29e35f2e68cd6465c5ec1", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 50414 10.31.10.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 50414 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fibre_channel_wwn": [], "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAI5YZQ7OH6eqgmanrwxkUl16pMvE2q26X32NofYRKBzF04m84VIsiCBP80rN+sGEKnRhTwlxJwcSfAyscmxkynk8ozeR0SaMEECkbOjee1DqGR1yz8VSKEIk2gZ+ImYscF6c32jGvz1w/gz9baswEs+v92Ljqv3+V3s8foVkwWM1AAAAFQDApo03iAyJzp9y7AillVl9LpN8rwAAAIBNHNvfLLH/rvWMdavYWGiljarx5Z8cDKFv4QiliuY2AenrQ5mjBN3ZJZuDpmwC9vuoPM+TWxp9pbrnVJy4VM6iS8c/Lr9I982fUD4neMvJEywdnYtsRhezGMCk57/Npw91h6EKhcAYiaFF53jl540WIjTvu2bEA8Hgb11YGH+isAAAAIAkremps+61DEFeDWQjRHbf8fZzhmpUYduU+sHRW5usa/1cOOeeN/8XBHfMST6TPedAY/6t7Oxda9D2mq6mo2Rl9arSQWcBypqwvzRiz0LGnRnElGtXKJALy6vYKG7xi+29ZmqlBvD14cB7/wSZqZP9MkRj3+QzQJLvNnuGRyLguA==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDBj8PEtqglWtlJ3r3hgP2TELjSd8JOOpjIitLlWjKdUao5ePB6PWTf9MZV0rLZr0re7hAS1EWeexARYQakyETmyOoPmRCaD5vvrfN3AJJ6I+O2EhApLpYrEORJbTfrme6AoCGmxQG8tR7j3YpVOvePZ65ka7FDUWnNLI0DWpyDURAKmvOxtiOcYazpmB7GJ/5ycpEAV7KGp7tEQ9MNIAbSaYTBXVBNa5V2HyEmcabs+/Qy/jp8OWy+Tl3uCUV0SmFplVCKib9Kp3eEMZd5udXsYnmUYtLNMJQkQOzTdol5AozustkdBnasVn/RSnQpWQMBrrUQMxchNOb8FDAuH6AONEVJl9mHY6mk3zfkkyPZE6sIrMIj0B48xTWzMIjC+N9SN7DRRUWzjYIqkL5fsYu0fkkGuZeNvyJRlv8h7oFWA7YtvNHdNYf41mkXryERg8V3zI0aZcmQul6XTOxywwd4b5sudMIng09hfyPOKtnYi6DIN2h5FxOWlvBEbLlcd2U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPUvqdp1GSRMDwSqfOZO1hLGpDfzy51B9cIhTK2AWy7qlUXPaSlJ0jc31uj+CW3SnUW36VSKRHdj9R9hJev9Zic=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFL7RdA+aCgUcBhcJBLwti3mnwduhYXxSw8RlI3Cvebm", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2844, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 873, "free": 2844}, "nocache": {"free": 3477, "used": 240}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_uuid": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": 
["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 1230, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251139743744, "block_size": 4096, "block_total": 64479564, "block_available": 61313414, "block_used": 3166150, "inode_total": 16384000, "inode_available": 16302069, "inode_used": 81931, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "05", "second": "26", "epoch": "1727204726", "epoch_int": "1727204726", "date": "2024-09-24", "time": "15:05:26", "iso8601_micro": "2024-09-24T19:05:26.563417Z", "iso8601": "2024-09-24T19:05:26Z", "iso8601_basic": "20240924T150526563417", "iso8601_basic_short": "20240924T150526", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_fips": false, "ansible_apparmor": {"status": "disabled"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_is_chroot": false, "ansible_loadavg": {"1m": 0.71142578125, "5m": 0.8291015625, "15m": 0.54345703125}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_iscsi_iqn": "", "ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_hostnqn": "", "ansible_lsb": {}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::37d3:4e93:30d:de94", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", 
"rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", 
"tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.90"], "ansible_all_ipv6_addresses": ["fe80::37d3:4e93:30d:de94"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::37d3:4e93:30d:de94"]}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
43681 1727204726.63796: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204725.809101-45293-13336983240/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 43681 1727204726.63801: _low_level_execute_command(): starting 43681 1727204726.63805: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204725.809101-45293-13336983240/ > /dev/null 2>&1 && sleep 0' 43681 1727204726.64446: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204726.64468: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204726.64490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204726.64587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204726.64630: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204726.64648: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204726.64671: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204726.64747: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204726.66797: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204726.66801: stdout chunk (state=3): >>><<< 43681 1727204726.66804: stderr chunk (state=3): >>><<< 43681 1727204726.66806: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204726.66815: handler run complete 43681 1727204726.67095: variable 'ansible_facts' from source: unknown 43681 1727204726.67204: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204726.67747: variable 'ansible_facts' from source: unknown 43681 1727204726.67885: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204726.68129: attempt loop complete, returning result 43681 1727204726.68140: _execute() done 43681 1727204726.68148: dumping result to json 43681 1727204726.68195: done dumping result, returning 43681 1727204726.68208: done running TaskExecutor() for managed-node3/TASK: Gathering Facts [12b410aa-8751-9e86-7728-00000000059d] 43681 1727204726.68223: sending task result for task 12b410aa-8751-9e86-7728-00000000059d 43681 1727204726.68769: done sending task result for task 12b410aa-8751-9e86-7728-00000000059d 43681 1727204726.68772: WORKER PROCESS EXITING ok: [managed-node3] 43681 1727204726.69517: no more pending results, returning what we have 43681 1727204726.69521: results queue empty 43681 1727204726.69522: checking for any_errors_fatal 43681 1727204726.69523: done checking for any_errors_fatal 43681 1727204726.69524: checking for max_fail_percentage 43681 1727204726.69531: done checking for max_fail_percentage 43681 1727204726.69532: checking to see if all hosts have failed and the running result is not ok 43681 1727204726.69533: done checking to see if all hosts have failed 43681 1727204726.69534: getting the remaining hosts for this loop 43681 1727204726.69535: done getting the remaining hosts for this loop 43681 1727204726.69539: getting the next task for host managed-node3 43681 1727204726.69545: done getting next task for host managed-node3 43681 1727204726.69547: ^ task is: TASK: meta (flush_handlers) 43681 1727204726.69550: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204726.69554: getting variables 43681 1727204726.69556: in VariableManager get_vars() 43681 1727204726.69586: Calling all_inventory to load vars for managed-node3 43681 1727204726.69591: Calling groups_inventory to load vars for managed-node3 43681 1727204726.69594: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204726.69606: Calling all_plugins_play to load vars for managed-node3 43681 1727204726.69609: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204726.69613: Calling groups_plugins_play to load vars for managed-node3 43681 1727204726.71834: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204726.77119: done with get_vars() 43681 1727204726.77231: done getting variables 43681 1727204726.77421: in VariableManager get_vars() 43681 1727204726.77438: Calling all_inventory to load vars for managed-node3 43681 1727204726.77441: Calling groups_inventory to load vars for managed-node3 43681 1727204726.77444: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204726.77450: Calling all_plugins_play to load vars for managed-node3 43681 1727204726.77454: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204726.77458: Calling groups_plugins_play to load vars for managed-node3 43681 1727204726.79829: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204726.82941: done with get_vars() 43681 1727204726.82995: done queuing things up, now waiting for results queue to drain 43681 1727204726.82998: results queue empty 43681 1727204726.82999: checking for any_errors_fatal 43681 1727204726.83006: done checking for any_errors_fatal 43681 1727204726.83007: checking for max_fail_percentage 43681 1727204726.83008: done checking for max_fail_percentage 43681 1727204726.83009: checking to see if all hosts have failed and the running result is not ok 43681 1727204726.83015: done checking to see if all hosts have failed 43681 1727204726.83016: getting the remaining hosts for this loop 43681 1727204726.83018: done getting the remaining hosts for this loop 43681 1727204726.83021: getting the next task for host managed-node3 43681 1727204726.83026: done getting next task for host managed-node3 43681 1727204726.83030: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 43681 1727204726.83032: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204726.83044: getting variables 43681 1727204726.83046: in VariableManager get_vars() 43681 1727204726.83064: Calling all_inventory to load vars for managed-node3 43681 1727204726.83067: Calling groups_inventory to load vars for managed-node3 43681 1727204726.83070: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204726.83076: Calling all_plugins_play to load vars for managed-node3 43681 1727204726.83079: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204726.83083: Calling groups_plugins_play to load vars for managed-node3 43681 1727204726.85274: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204726.88378: done with get_vars() 43681 1727204726.88419: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 15:05:26 -0400 (0:00:01.128) 0:00:34.551 ***** 43681 1727204726.88526: entering _queue_task() for managed-node3/include_tasks 43681 1727204726.89123: worker is 1 (out of 1 available) 43681 1727204726.89134: exiting _queue_task() for managed-node3/include_tasks 43681 1727204726.89147: done queuing things up, now waiting for results queue to drain 43681 1727204726.89149: waiting for pending results... 43681 1727204726.89614: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 43681 1727204726.89620: in run() - task 12b410aa-8751-9e86-7728-000000000091 43681 1727204726.89624: variable 'ansible_search_path' from source: unknown 43681 1727204726.89627: variable 'ansible_search_path' from source: unknown 43681 1727204726.89630: calling self._execute() 43681 1727204726.89637: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204726.89654: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204726.89671: variable 'omit' from source: magic vars 43681 1727204726.90155: variable 'ansible_distribution_major_version' from source: facts 43681 1727204726.90178: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204726.90255: _execute() done 43681 1727204726.90258: dumping result to json 43681 1727204726.90261: done dumping result, returning 43681 1727204726.90264: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12b410aa-8751-9e86-7728-000000000091] 43681 1727204726.90267: sending task result for task 12b410aa-8751-9e86-7728-000000000091 43681 1727204726.90404: no more pending results, returning what we have 43681 1727204726.90410: in VariableManager get_vars() 43681 1727204726.90458: Calling all_inventory to load vars for managed-node3 43681 1727204726.90461: Calling groups_inventory to load vars for managed-node3 43681 1727204726.90469: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204726.90485: Calling all_plugins_play to load vars for managed-node3 43681 1727204726.90491: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204726.90495: Calling groups_plugins_play to load vars for managed-node3 43681 1727204726.91210: done sending task result for task 12b410aa-8751-9e86-7728-000000000091 43681 1727204726.91214: WORKER PROCESS EXITING 43681 1727204726.93135: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204726.96229: done with get_vars() 43681 1727204726.96268: variable 'ansible_search_path' from source: unknown 43681 1727204726.96269: variable 'ansible_search_path' from source: unknown 43681 1727204726.96305: we have included files to process 43681 1727204726.96307: generating all_blocks data 43681 1727204726.96309: done generating all_blocks data 43681 1727204726.96310: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 43681 1727204726.96311: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 43681 1727204726.96314: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 43681 1727204726.97102: done processing included file 43681 1727204726.97104: iterating over new_blocks loaded from include file 43681 1727204726.97106: in VariableManager get_vars() 43681 1727204726.97137: done with get_vars() 43681 1727204726.97139: filtering new block on tags 43681 1727204726.97160: done filtering new block on tags 43681 1727204726.97163: in VariableManager get_vars() 43681 1727204726.97186: done with get_vars() 43681 1727204726.97188: filtering new block on tags 43681 1727204726.97213: done filtering new block on tags 43681 1727204726.97216: in VariableManager get_vars() 43681 1727204726.97244: done with get_vars() 43681 1727204726.97246: filtering new block on tags 43681 1727204726.97266: done filtering new block on tags 43681 1727204726.97268: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node3 43681 1727204726.97275: extending task lists for all hosts with included blocks 43681 1727204726.97796: done extending task lists 43681 1727204726.97798: done processing included files 43681 1727204726.97798: results queue empty 43681 1727204726.97799: checking for any_errors_fatal 43681 1727204726.97801: done checking for any_errors_fatal 43681 1727204726.97802: checking for max_fail_percentage 43681 1727204726.97803: done checking for max_fail_percentage 43681 1727204726.97804: checking to see if all hosts have failed and the running result is not ok 43681 1727204726.97806: done checking to see if all hosts have failed 43681 1727204726.97807: getting the remaining hosts for this loop 43681 1727204726.97808: done getting the remaining hosts for this loop 43681 1727204726.97811: getting the next task for host managed-node3 43681 1727204726.97815: done getting next task for host managed-node3 43681 1727204726.97818: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 43681 1727204726.97821: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204726.97832: getting variables 43681 1727204726.97833: in VariableManager get_vars() 43681 1727204726.97848: Calling all_inventory to load vars for managed-node3 43681 1727204726.97851: Calling groups_inventory to load vars for managed-node3 43681 1727204726.97854: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204726.97860: Calling all_plugins_play to load vars for managed-node3 43681 1727204726.97863: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204726.97867: Calling groups_plugins_play to load vars for managed-node3 43681 1727204726.99997: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204727.02802: done with get_vars() 43681 1727204727.02831: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 15:05:27 -0400 (0:00:00.143) 0:00:34.695 ***** 43681 1727204727.02898: entering _queue_task() for managed-node3/setup 43681 1727204727.03185: worker is 1 (out of 1 available) 43681 1727204727.03203: exiting _queue_task() for managed-node3/setup 43681 1727204727.03217: done queuing things up, now waiting for results queue to drain 43681 1727204727.03219: waiting for pending results... 43681 1727204727.03419: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 43681 1727204727.03520: in run() - task 12b410aa-8751-9e86-7728-0000000005de 43681 1727204727.03536: variable 'ansible_search_path' from source: unknown 43681 1727204727.03539: variable 'ansible_search_path' from source: unknown 43681 1727204727.03599: calling self._execute() 43681 1727204727.03685: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204727.03695: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204727.03707: variable 'omit' from source: magic vars 43681 1727204727.04046: variable 'ansible_distribution_major_version' from source: facts 43681 1727204727.04057: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204727.04298: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 43681 1727204727.06502: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 43681 1727204727.06562: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 43681 1727204727.06595: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 43681 1727204727.06627: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 43681 1727204727.06651: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 43681 1727204727.06728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204727.06752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 43681 1727204727.06777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204727.06817: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204727.06831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204727.06879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204727.06905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204727.06998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204727.07001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204727.07004: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204727.07115: variable '__network_required_facts' from source: role '' defaults 43681 1727204727.07126: variable 'ansible_facts' from source: unknown 43681 1727204727.07994: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 43681 1727204727.07998: when evaluation is False, skipping this task 43681 1727204727.08001: _execute() done 43681 1727204727.08003: dumping result to json 43681 1727204727.08005: done dumping result, returning 43681 1727204727.08008: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12b410aa-8751-9e86-7728-0000000005de] 43681 1727204727.08011: sending task result for task 12b410aa-8751-9e86-7728-0000000005de skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 43681 1727204727.08325: no more pending results, returning what we have 43681 1727204727.08329: results queue empty 43681 1727204727.08330: checking for any_errors_fatal 43681 1727204727.08332: done checking for any_errors_fatal 43681 1727204727.08333: checking for max_fail_percentage 43681 1727204727.08336: done checking for max_fail_percentage 43681 1727204727.08337: checking to see if all hosts have failed and the running result is not ok 43681 1727204727.08338: done checking to see if all hosts have failed 43681 1727204727.08338: getting the remaining hosts for this loop 43681 1727204727.08340: done getting the remaining hosts for this loop 43681 1727204727.08344: getting the next task for host managed-node3 43681 1727204727.08351: done getting next task for host 
managed-node3 43681 1727204727.08356: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 43681 1727204727.08359: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 43681 1727204727.08395: getting variables 43681 1727204727.08397: in VariableManager get_vars() 43681 1727204727.08438: Calling all_inventory to load vars for managed-node3 43681 1727204727.08441: Calling groups_inventory to load vars for managed-node3 43681 1727204727.08444: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204727.08455: Calling all_plugins_play to load vars for managed-node3 43681 1727204727.08458: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204727.08463: Calling groups_plugins_play to load vars for managed-node3 43681 1727204727.09113: done sending task result for task 12b410aa-8751-9e86-7728-0000000005de 43681 1727204727.09116: WORKER PROCESS EXITING 43681 1727204727.10925: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204727.14456: done with get_vars() 43681 1727204727.14505: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 15:05:27 -0400 (0:00:00.117) 0:00:34.812 ***** 43681 1727204727.14626: entering _queue_task() for managed-node3/stat 43681 1727204727.15273: worker is 1 (out of 1 available) 43681 1727204727.15284: exiting _queue_task() for managed-node3/stat 43681 1727204727.15329: done queuing things up, now waiting for results queue to drain 43681 1727204727.15332: waiting for pending results... 
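The task queued above, fedora.linux_system_roles.network : Check if system is ostree (set_facts.yml:12), runs the stat module guarded by the conditional not __network_is_ostree is defined, which the following records show evaluating to False so the task is skipped. A minimal sketch of such a stat-based check follows; the probed path and the register name are assumptions for illustration, not taken from the role source.

    - name: Check if system is ostree   # sketch; path and register name are assumptions
      ansible.builtin.stat:
        path: /run/ostree-booted
      register: __ostree_booted_stat
      when: not __network_is_ostree is defined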
43681 1727204727.15571: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 43681 1727204727.15776: in run() - task 12b410aa-8751-9e86-7728-0000000005e0 43681 1727204727.15781: variable 'ansible_search_path' from source: unknown 43681 1727204727.15784: variable 'ansible_search_path' from source: unknown 43681 1727204727.15797: calling self._execute() 43681 1727204727.15919: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204727.15968: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204727.16001: variable 'omit' from source: magic vars 43681 1727204727.16856: variable 'ansible_distribution_major_version' from source: facts 43681 1727204727.16986: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204727.17108: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 43681 1727204727.17621: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 43681 1727204727.17715: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 43681 1727204727.17840: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 43681 1727204727.17895: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 43681 1727204727.18004: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 43681 1727204727.18043: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 43681 1727204727.18119: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204727.18166: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 43681 1727204727.18276: variable '__network_is_ostree' from source: set_fact 43681 1727204727.18293: Evaluated conditional (not __network_is_ostree is defined): False 43681 1727204727.18303: when evaluation is False, skipping this task 43681 1727204727.18495: _execute() done 43681 1727204727.18499: dumping result to json 43681 1727204727.18501: done dumping result, returning 43681 1727204727.18504: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [12b410aa-8751-9e86-7728-0000000005e0] 43681 1727204727.18507: sending task result for task 12b410aa-8751-9e86-7728-0000000005e0 43681 1727204727.18577: done sending task result for task 12b410aa-8751-9e86-7728-0000000005e0 43681 1727204727.18580: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 43681 1727204727.18642: no more pending results, returning what we have 43681 1727204727.18648: results queue empty 43681 1727204727.18649: checking for any_errors_fatal 43681 1727204727.18658: done checking for any_errors_fatal 43681 1727204727.18659: checking for 
max_fail_percentage 43681 1727204727.18661: done checking for max_fail_percentage 43681 1727204727.18663: checking to see if all hosts have failed and the running result is not ok 43681 1727204727.18664: done checking to see if all hosts have failed 43681 1727204727.18665: getting the remaining hosts for this loop 43681 1727204727.18667: done getting the remaining hosts for this loop 43681 1727204727.18672: getting the next task for host managed-node3 43681 1727204727.18680: done getting next task for host managed-node3 43681 1727204727.18684: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 43681 1727204727.18688: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 43681 1727204727.18707: getting variables 43681 1727204727.18709: in VariableManager get_vars() 43681 1727204727.18754: Calling all_inventory to load vars for managed-node3 43681 1727204727.18757: Calling groups_inventory to load vars for managed-node3 43681 1727204727.18761: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204727.18773: Calling all_plugins_play to load vars for managed-node3 43681 1727204727.18777: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204727.18781: Calling groups_plugins_play to load vars for managed-node3 43681 1727204727.22658: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204727.26412: done with get_vars() 43681 1727204727.26461: done getting variables 43681 1727204727.26535: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 15:05:27 -0400 (0:00:00.119) 0:00:34.932 ***** 43681 1727204727.26577: entering _queue_task() for managed-node3/set_fact 43681 1727204727.26964: worker is 1 (out of 1 available) 43681 1727204727.26978: exiting _queue_task() for managed-node3/set_fact 43681 1727204727.27096: done queuing things up, now waiting for results queue to drain 43681 1727204727.27098: waiting for pending results... 
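The task queued above, fedora.linux_system_roles.network : Set flag to indicate system is ostree (set_facts.yml:17), uses set_fact under the same not __network_is_ostree is defined guard; the next records show it being skipped because the flag already exists from an earlier set_fact. A minimal sketch, reusing the hypothetical register from the previous sketch; the value expression is an assumption, not the role's actual source.

    - name: Set flag to indicate system is ostree   # sketch; value expression is an assumption
      ansible.builtin.set_fact:
        __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
      when: not __network_is_ostree is defined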
43681 1727204727.27410: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 43681 1727204727.27476: in run() - task 12b410aa-8751-9e86-7728-0000000005e1 43681 1727204727.27502: variable 'ansible_search_path' from source: unknown 43681 1727204727.27514: variable 'ansible_search_path' from source: unknown 43681 1727204727.27561: calling self._execute() 43681 1727204727.27727: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204727.27731: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204727.27734: variable 'omit' from source: magic vars 43681 1727204727.28162: variable 'ansible_distribution_major_version' from source: facts 43681 1727204727.28181: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204727.28409: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 43681 1727204727.28734: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 43681 1727204727.28796: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 43681 1727204727.28846: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 43681 1727204727.28914: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 43681 1727204727.28998: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 43681 1727204727.29039: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 43681 1727204727.29195: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204727.29198: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 43681 1727204727.29222: variable '__network_is_ostree' from source: set_fact 43681 1727204727.29236: Evaluated conditional (not __network_is_ostree is defined): False 43681 1727204727.29244: when evaluation is False, skipping this task 43681 1727204727.29252: _execute() done 43681 1727204727.29261: dumping result to json 43681 1727204727.29271: done dumping result, returning 43681 1727204727.29283: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12b410aa-8751-9e86-7728-0000000005e1] 43681 1727204727.29299: sending task result for task 12b410aa-8751-9e86-7728-0000000005e1 skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 43681 1727204727.29470: no more pending results, returning what we have 43681 1727204727.29475: results queue empty 43681 1727204727.29476: checking for any_errors_fatal 43681 1727204727.29482: done checking for any_errors_fatal 43681 1727204727.29483: checking for max_fail_percentage 43681 1727204727.29485: done checking for max_fail_percentage 43681 1727204727.29486: checking to see 
if all hosts have failed and the running result is not ok 43681 1727204727.29487: done checking to see if all hosts have failed 43681 1727204727.29488: getting the remaining hosts for this loop 43681 1727204727.29492: done getting the remaining hosts for this loop 43681 1727204727.29497: getting the next task for host managed-node3 43681 1727204727.29509: done getting next task for host managed-node3 43681 1727204727.29514: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 43681 1727204727.29518: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 43681 1727204727.29534: getting variables 43681 1727204727.29536: in VariableManager get_vars() 43681 1727204727.29582: Calling all_inventory to load vars for managed-node3 43681 1727204727.29586: Calling groups_inventory to load vars for managed-node3 43681 1727204727.29692: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204727.29706: Calling all_plugins_play to load vars for managed-node3 43681 1727204727.29711: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204727.29715: Calling groups_plugins_play to load vars for managed-node3 43681 1727204727.30407: done sending task result for task 12b410aa-8751-9e86-7728-0000000005e1 43681 1727204727.30410: WORKER PROCESS EXITING 43681 1727204727.32202: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204727.35258: done with get_vars() 43681 1727204727.35302: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 15:05:27 -0400 (0:00:00.088) 0:00:35.020 ***** 43681 1727204727.35423: entering _queue_task() for managed-node3/service_facts 43681 1727204727.35905: worker is 1 (out of 1 available) 43681 1727204727.35917: exiting _queue_task() for managed-node3/service_facts 43681 1727204727.35928: done queuing things up, now waiting for results queue to drain 43681 1727204727.35930: waiting for pending results... 
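The task queued above, fedora.linux_system_roles.network : Check which services are running (set_facts.yml:21), invokes the service_facts module, which the following records show being transferred over the multiplexed SSH connection and executed with /usr/bin/python3.12 on the target. A minimal sketch of the call and one way to consume the returned mapping; the debug task is illustrative and not part of the logged playbook, while chronyd.service is one of the entries visible in the output at the end of this excerpt.

    - name: Check which services are running   # sketch of a service_facts call
      ansible.builtin.service_facts:

    - name: Example use of the gathered mapping   # illustrative follow-up, not in the logged playbook
      ansible.builtin.debug:
        msg: "chronyd is {{ ansible_facts.services['chronyd.service'].state }}"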
43681 1727204727.36142: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running 43681 1727204727.36297: in run() - task 12b410aa-8751-9e86-7728-0000000005e3 43681 1727204727.36321: variable 'ansible_search_path' from source: unknown 43681 1727204727.36331: variable 'ansible_search_path' from source: unknown 43681 1727204727.36381: calling self._execute() 43681 1727204727.36500: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204727.36515: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204727.36533: variable 'omit' from source: magic vars 43681 1727204727.36977: variable 'ansible_distribution_major_version' from source: facts 43681 1727204727.37000: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204727.37014: variable 'omit' from source: magic vars 43681 1727204727.37099: variable 'omit' from source: magic vars 43681 1727204727.37147: variable 'omit' from source: magic vars 43681 1727204727.37205: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204727.37253: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204727.37283: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204727.37317: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204727.37335: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204727.37375: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204727.37385: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204727.37398: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204727.37534: Set connection var ansible_shell_type to sh 43681 1727204727.37548: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204727.37562: Set connection var ansible_timeout to 10 43681 1727204727.37577: Set connection var ansible_pipelining to False 43681 1727204727.37588: Set connection var ansible_connection to ssh 43681 1727204727.37603: Set connection var ansible_shell_executable to /bin/sh 43681 1727204727.37641: variable 'ansible_shell_executable' from source: unknown 43681 1727204727.37652: variable 'ansible_connection' from source: unknown 43681 1727204727.37661: variable 'ansible_module_compression' from source: unknown 43681 1727204727.37669: variable 'ansible_shell_type' from source: unknown 43681 1727204727.37676: variable 'ansible_shell_executable' from source: unknown 43681 1727204727.37684: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204727.37695: variable 'ansible_pipelining' from source: unknown 43681 1727204727.37703: variable 'ansible_timeout' from source: unknown 43681 1727204727.37713: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204727.37956: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 43681 1727204727.37980: variable 'omit' from source: magic vars 43681 
1727204727.37995: starting attempt loop 43681 1727204727.38073: running the handler 43681 1727204727.38077: _low_level_execute_command(): starting 43681 1727204727.38080: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 43681 1727204727.38791: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204727.38807: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204727.38845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204727.38865: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 43681 1727204727.38908: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204727.38984: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204727.39007: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204727.39038: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204727.39118: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204727.40891: stdout chunk (state=3): >>>/root <<< 43681 1727204727.41085: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204727.41090: stdout chunk (state=3): >>><<< 43681 1727204727.41093: stderr chunk (state=3): >>><<< 43681 1727204727.41216: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204727.41220: _low_level_execute_command(): starting 43681 1727204727.41223: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1727204727.4111807-45523-277172672290985 `" && echo ansible-tmp-1727204727.4111807-45523-277172672290985="` echo /root/.ansible/tmp/ansible-tmp-1727204727.4111807-45523-277172672290985 `" ) && sleep 0' 43681 1727204727.41775: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204727.41803: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204727.41851: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 43681 1727204727.41945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204727.41974: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204727.41986: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204727.42066: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204727.44024: stdout chunk (state=3): >>>ansible-tmp-1727204727.4111807-45523-277172672290985=/root/.ansible/tmp/ansible-tmp-1727204727.4111807-45523-277172672290985 <<< 43681 1727204727.44212: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204727.44231: stdout chunk (state=3): >>><<< 43681 1727204727.44245: stderr chunk (state=3): >>><<< 43681 1727204727.44266: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204727.4111807-45523-277172672290985=/root/.ansible/tmp/ansible-tmp-1727204727.4111807-45523-277172672290985 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204727.44394: variable 'ansible_module_compression' from source: unknown 43681 1727204727.44398: ANSIBALLZ: 
using cached module: /root/.ansible/tmp/ansible-local-436817kzep24q/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 43681 1727204727.44439: variable 'ansible_facts' from source: unknown 43681 1727204727.44553: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204727.4111807-45523-277172672290985/AnsiballZ_service_facts.py 43681 1727204727.44828: Sending initial data 43681 1727204727.44832: Sent initial data (162 bytes) 43681 1727204727.45393: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204727.45577: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204727.45582: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204727.47229: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 43681 1727204727.47266: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 43681 1727204727.47335: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-436817kzep24q/tmp1w4snwqf /root/.ansible/tmp/ansible-tmp-1727204727.4111807-45523-277172672290985/AnsiballZ_service_facts.py <<< 43681 1727204727.47338: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204727.4111807-45523-277172672290985/AnsiballZ_service_facts.py" <<< 43681 1727204727.47382: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-436817kzep24q/tmp1w4snwqf" to remote "/root/.ansible/tmp/ansible-tmp-1727204727.4111807-45523-277172672290985/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204727.4111807-45523-277172672290985/AnsiballZ_service_facts.py" <<< 43681 1727204727.48562: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204727.48603: stderr chunk (state=3): >>><<< 43681 1727204727.48613: stdout chunk (state=3): >>><<< 43681 1727204727.48642: done transferring module to remote 43681 1727204727.48697: _low_level_execute_command(): starting 43681 1727204727.48705: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204727.4111807-45523-277172672290985/ /root/.ansible/tmp/ansible-tmp-1727204727.4111807-45523-277172672290985/AnsiballZ_service_facts.py && sleep 0' 43681 1727204727.49407: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204727.49411: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204727.49423: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 43681 1727204727.49480: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204727.49532: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204727.49561: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204727.49605: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204727.49671: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204727.51510: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204727.51565: stderr chunk (state=3): >>><<< 43681 1727204727.51573: stdout chunk (state=3): >>><<< 43681 1727204727.51580: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 
originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204727.51584: _low_level_execute_command(): starting 43681 1727204727.51593: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204727.4111807-45523-277172672290985/AnsiballZ_service_facts.py && sleep 0' 43681 1727204727.52037: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204727.52041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204727.52044: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204727.52046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204727.52096: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204727.52105: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204727.52146: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204729.48305: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": 
"stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", 
"state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": 
"systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": 
"systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.se<<< 43681 1727204729.48367: stdout chunk (state=3): >>>rvice", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": 
{"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": 
"inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", 
"source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": 
{"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 43681 1727204729.50015: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204729.50030: stderr chunk (state=3): >>>Shared connection to 10.31.10.90 closed. <<< 43681 1727204729.50220: stderr chunk (state=3): >>><<< 43681 1727204729.50228: stdout chunk (state=3): >>><<< 43681 1727204729.50258: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": 
"inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": 
"serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", 
"status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, 
"dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
43681 1727204729.51633: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204727.4111807-45523-277172672290985/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 43681 1727204729.51638: _low_level_execute_command(): starting 43681 1727204729.51717: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204727.4111807-45523-277172672290985/ > /dev/null 2>&1 && sleep 0' 43681 1727204729.52257: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204729.52261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 43681 1727204729.52264: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 43681 1727204729.52266: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204729.52325: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204729.52350: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204729.52407: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204729.54303: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204729.54358: stderr chunk (state=3): >>><<< 43681 1727204729.54362: stdout chunk (state=3): >>><<< 43681 1727204729.54375: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 
10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204729.54382: handler run complete 43681 1727204729.54555: variable 'ansible_facts' from source: unknown 43681 1727204729.54709: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204729.55147: variable 'ansible_facts' from source: unknown 43681 1727204729.55270: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204729.55468: attempt loop complete, returning result 43681 1727204729.55475: _execute() done 43681 1727204729.55478: dumping result to json 43681 1727204729.55530: done dumping result, returning 43681 1727204729.55539: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running [12b410aa-8751-9e86-7728-0000000005e3] 43681 1727204729.55545: sending task result for task 12b410aa-8751-9e86-7728-0000000005e3 43681 1727204729.56662: done sending task result for task 12b410aa-8751-9e86-7728-0000000005e3 43681 1727204729.56666: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 43681 1727204729.56744: no more pending results, returning what we have 43681 1727204729.56748: results queue empty 43681 1727204729.56749: checking for any_errors_fatal 43681 1727204729.56753: done checking for any_errors_fatal 43681 1727204729.56754: checking for max_fail_percentage 43681 1727204729.56756: done checking for max_fail_percentage 43681 1727204729.56757: checking to see if all hosts have failed and the running result is not ok 43681 1727204729.56758: done checking to see if all hosts have failed 43681 1727204729.56758: getting the remaining hosts for this loop 43681 1727204729.56760: done getting the remaining hosts for this loop 43681 1727204729.56764: getting the next task for host managed-node3 43681 1727204729.56769: done getting next task for host managed-node3 43681 1727204729.56773: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 43681 1727204729.56777: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204729.56791: getting variables 43681 1727204729.56793: in VariableManager get_vars() 43681 1727204729.56825: Calling all_inventory to load vars for managed-node3 43681 1727204729.56828: Calling groups_inventory to load vars for managed-node3 43681 1727204729.56831: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204729.56841: Calling all_plugins_play to load vars for managed-node3 43681 1727204729.56844: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204729.56848: Calling groups_plugins_play to load vars for managed-node3 43681 1727204729.58512: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204729.60403: done with get_vars() 43681 1727204729.60428: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 15:05:29 -0400 (0:00:02.250) 0:00:37.271 ***** 43681 1727204729.60514: entering _queue_task() for managed-node3/package_facts 43681 1727204729.60866: worker is 1 (out of 1 available) 43681 1727204729.60880: exiting _queue_task() for managed-node3/package_facts 43681 1727204729.60896: done queuing things up, now waiting for results queue to drain 43681 1727204729.60898: waiting for pending results... 43681 1727204729.61171: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 43681 1727204729.61285: in run() - task 12b410aa-8751-9e86-7728-0000000005e4 43681 1727204729.61302: variable 'ansible_search_path' from source: unknown 43681 1727204729.61306: variable 'ansible_search_path' from source: unknown 43681 1727204729.61340: calling self._execute() 43681 1727204729.61465: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204729.61477: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204729.61484: variable 'omit' from source: magic vars 43681 1727204729.61822: variable 'ansible_distribution_major_version' from source: facts 43681 1727204729.61835: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204729.61842: variable 'omit' from source: magic vars 43681 1727204729.61894: variable 'omit' from source: magic vars 43681 1727204729.61930: variable 'omit' from source: magic vars 43681 1727204729.61970: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204729.62026: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204729.62043: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204729.62060: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204729.62070: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204729.62100: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204729.62104: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204729.62108: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204729.62195: Set connection var ansible_shell_type to sh 43681 
1727204729.62202: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204729.62209: Set connection var ansible_timeout to 10 43681 1727204729.62217: Set connection var ansible_pipelining to False 43681 1727204729.62229: Set connection var ansible_connection to ssh 43681 1727204729.62232: Set connection var ansible_shell_executable to /bin/sh 43681 1727204729.62256: variable 'ansible_shell_executable' from source: unknown 43681 1727204729.62259: variable 'ansible_connection' from source: unknown 43681 1727204729.62262: variable 'ansible_module_compression' from source: unknown 43681 1727204729.62267: variable 'ansible_shell_type' from source: unknown 43681 1727204729.62269: variable 'ansible_shell_executable' from source: unknown 43681 1727204729.62274: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204729.62279: variable 'ansible_pipelining' from source: unknown 43681 1727204729.62283: variable 'ansible_timeout' from source: unknown 43681 1727204729.62288: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204729.62464: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 43681 1727204729.62474: variable 'omit' from source: magic vars 43681 1727204729.62480: starting attempt loop 43681 1727204729.62483: running the handler 43681 1727204729.62499: _low_level_execute_command(): starting 43681 1727204729.62506: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 43681 1727204729.63235: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204729.63288: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204729.63295: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204729.63345: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204729.65113: stdout chunk (state=3): >>>/root <<< 43681 1727204729.65236: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204729.65340: stderr chunk (state=3): >>><<< 43681 1727204729.65344: stdout chunk (state=3): >>><<< 43681 1727204729.65377: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204729.65388: _low_level_execute_command(): starting 43681 1727204729.65397: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204729.6537492-45568-44753510047670 `" && echo ansible-tmp-1727204729.6537492-45568-44753510047670="` echo /root/.ansible/tmp/ansible-tmp-1727204729.6537492-45568-44753510047670 `" ) && sleep 0' 43681 1727204729.66026: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204729.66030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204729.66070: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204729.66073: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204729.66119: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204729.68159: stdout chunk (state=3): >>>ansible-tmp-1727204729.6537492-45568-44753510047670=/root/.ansible/tmp/ansible-tmp-1727204729.6537492-45568-44753510047670 <<< 43681 1727204729.68247: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204729.68294: stderr chunk (state=3): >>><<< 43681 1727204729.68299: stdout chunk (state=3): >>><<< 43681 1727204729.68315: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204729.6537492-45568-44753510047670=/root/.ansible/tmp/ansible-tmp-1727204729.6537492-45568-44753510047670 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204729.68356: variable 'ansible_module_compression' from source: unknown 43681 1727204729.68394: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-436817kzep24q/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 43681 1727204729.68450: variable 'ansible_facts' from source: unknown 43681 1727204729.68694: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204729.6537492-45568-44753510047670/AnsiballZ_package_facts.py 43681 1727204729.68932: Sending initial data 43681 1727204729.69301: Sent initial data (161 bytes) 43681 1727204729.69919: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204729.69939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204729.69952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204729.70010: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204729.70027: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204729.70065: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204729.71721: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 
1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 43681 1727204729.71728: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 43681 1727204729.71749: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 43681 1727204729.71783: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-436817kzep24q/tmp5388a2sl /root/.ansible/tmp/ansible-tmp-1727204729.6537492-45568-44753510047670/AnsiballZ_package_facts.py <<< 43681 1727204729.71793: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204729.6537492-45568-44753510047670/AnsiballZ_package_facts.py" <<< 43681 1727204729.71834: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-436817kzep24q/tmp5388a2sl" to remote "/root/.ansible/tmp/ansible-tmp-1727204729.6537492-45568-44753510047670/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204729.6537492-45568-44753510047670/AnsiballZ_package_facts.py" <<< 43681 1727204729.74413: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204729.74465: stderr chunk (state=3): >>><<< 43681 1727204729.74476: stdout chunk (state=3): >>><<< 43681 1727204729.74613: done transferring module to remote 43681 1727204729.74617: _low_level_execute_command(): starting 43681 1727204729.74620: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204729.6537492-45568-44753510047670/ /root/.ansible/tmp/ansible-tmp-1727204729.6537492-45568-44753510047670/AnsiballZ_package_facts.py && sleep 0' 43681 1727204729.75175: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204729.75194: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204729.75310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 43681 1727204729.75335: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204729.75352: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204729.75419: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204729.77436: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204729.77618: stderr chunk (state=3): >>><<< 43681 1727204729.77625: stdout chunk (state=3): >>><<< 43681 1727204729.77632: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204729.77634: _low_level_execute_command(): starting 43681 1727204729.77637: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204729.6537492-45568-44753510047670/AnsiballZ_package_facts.py && sleep 0' 43681 1727204729.78296: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204729.78312: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204729.78621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204729.78643: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204729.78724: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204730.42907: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": 
"2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "a<<< 43681 1727204730.42928: stdout chunk (state=3): >>>rch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": 
[{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "r<<< 43681 1727204730.42962: stdout chunk (state=3): >>>pm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": 
"53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "relea<<< 43681 1727204730.42976: stdout chunk (state=3): >>>se": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-<<< 43681 1727204730.42979: stdout chunk (state=3): >>>libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, <<< 43681 1727204730.43034: stdout chunk (state=3): >>>"arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", 
"release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": 
"python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb",<<< 43681 1727204730.43050: stdout chunk (state=3): >>> "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": 
"libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.<<< 43681 1727204730.43079: stdout chunk (state=3): >>>fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", 
"release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5<<< 43681 1727204730.43087: stdout chunk (state=3): >>>", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": 
[{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": <<< 43681 1727204730.43108: stdout chunk (state=3): >>>"perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", 
"version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], 
"perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", <<< 43681 1727204730.43137: stdout chunk (state=3): >>>"source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch<<< 43681 1727204730.43155: stdout chunk (state=3): >>>": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_<<< 43681 1727204730.43177: stdout chunk (state=3): >>>64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": nul<<< 43681 1727204730.43191: stdout chunk (state=3): >>>l, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 43681 1727204730.45097: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
<<< 43681 1727204730.45162: stderr chunk (state=3): >>><<< 43681 1727204730.45166: stdout chunk (state=3): >>><<< 43681 1727204730.45223: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", 
"version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", 
"release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": 
"firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", 
"version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": 
"ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": 
"fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", 
"version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": 
"1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", 
"release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", 
"version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": 
"libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": 
"net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 43681 1727204730.47465: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204729.6537492-45568-44753510047670/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 43681 1727204730.47486: _low_level_execute_command(): starting 43681 1727204730.47493: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204729.6537492-45568-44753510047670/ > /dev/null 2>&1 && sleep 0' 43681 1727204730.47995: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204730.47998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204730.48001: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204730.48004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204730.48063: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204730.48071: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204730.48073: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204730.48109: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 
1727204730.50033: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204730.50094: stderr chunk (state=3): >>><<< 43681 1727204730.50098: stdout chunk (state=3): >>><<< 43681 1727204730.50112: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204730.50120: handler run complete 43681 1727204730.50961: variable 'ansible_facts' from source: unknown 43681 1727204730.51423: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204730.53436: variable 'ansible_facts' from source: unknown 43681 1727204730.53938: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204730.54713: attempt loop complete, returning result 43681 1727204730.54731: _execute() done 43681 1727204730.54734: dumping result to json 43681 1727204730.54912: done dumping result, returning 43681 1727204730.54926: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [12b410aa-8751-9e86-7728-0000000005e4] 43681 1727204730.54933: sending task result for task 12b410aa-8751-9e86-7728-0000000005e4 43681 1727204730.56958: done sending task result for task 12b410aa-8751-9e86-7728-0000000005e4 43681 1727204730.56962: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 43681 1727204730.57070: no more pending results, returning what we have 43681 1727204730.57072: results queue empty 43681 1727204730.57073: checking for any_errors_fatal 43681 1727204730.57077: done checking for any_errors_fatal 43681 1727204730.57078: checking for max_fail_percentage 43681 1727204730.57079: done checking for max_fail_percentage 43681 1727204730.57080: checking to see if all hosts have failed and the running result is not ok 43681 1727204730.57081: done checking to see if all hosts have failed 43681 1727204730.57081: getting the remaining hosts for this loop 43681 1727204730.57082: done getting the remaining hosts for this loop 43681 1727204730.57085: getting the next task for host managed-node3 43681 1727204730.57092: done getting next task for host managed-node3 43681 1727204730.57095: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 43681 1727204730.57097: 
^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 43681 1727204730.57104: getting variables 43681 1727204730.57105: in VariableManager get_vars() 43681 1727204730.57132: Calling all_inventory to load vars for managed-node3 43681 1727204730.57134: Calling groups_inventory to load vars for managed-node3 43681 1727204730.57135: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204730.57143: Calling all_plugins_play to load vars for managed-node3 43681 1727204730.57145: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204730.57147: Calling groups_plugins_play to load vars for managed-node3 43681 1727204730.58391: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204730.59998: done with get_vars() 43681 1727204730.60022: done getting variables 43681 1727204730.60072: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 15:05:30 -0400 (0:00:00.995) 0:00:38.267 ***** 43681 1727204730.60103: entering _queue_task() for managed-node3/debug 43681 1727204730.60366: worker is 1 (out of 1 available) 43681 1727204730.60381: exiting _queue_task() for managed-node3/debug 43681 1727204730.60394: done queuing things up, now waiting for results queue to drain 43681 1727204730.60396: waiting for pending results... 
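The censored "ok" result above belongs to the role's "Check which packages are installed" task: the package_facts module ran with module_args {"manager": ["auto"], "strategy": "first"} and '_ansible_no_log': True, which is why the task result is reported as hidden even though the raw module stdout (the package inventory) still appears earlier in this transcript. A minimal sketch of a task shaped to match that invocation; the fully-qualified module name and the explicit manager value are assumptions, not the role's actual source:

    - name: Check which packages are installed
      ansible.builtin.package_facts:
        manager: auto        # "auto" and strategy "first" are the module defaults seen in the logged module_args
      no_log: true           # matches '_ansible_no_log': True and the censored result above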
43681 1727204730.60594: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider 43681 1727204730.60673: in run() - task 12b410aa-8751-9e86-7728-000000000092 43681 1727204730.60687: variable 'ansible_search_path' from source: unknown 43681 1727204730.60692: variable 'ansible_search_path' from source: unknown 43681 1727204730.60726: calling self._execute() 43681 1727204730.60815: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204730.60822: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204730.60835: variable 'omit' from source: magic vars 43681 1727204730.61167: variable 'ansible_distribution_major_version' from source: facts 43681 1727204730.61178: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204730.61185: variable 'omit' from source: magic vars 43681 1727204730.61218: variable 'omit' from source: magic vars 43681 1727204730.61304: variable 'network_provider' from source: set_fact 43681 1727204730.61319: variable 'omit' from source: magic vars 43681 1727204730.61358: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204730.61389: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204730.61410: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204730.61429: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204730.61441: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204730.61468: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204730.61471: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204730.61475: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204730.61564: Set connection var ansible_shell_type to sh 43681 1727204730.61570: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204730.61577: Set connection var ansible_timeout to 10 43681 1727204730.61585: Set connection var ansible_pipelining to False 43681 1727204730.61593: Set connection var ansible_connection to ssh 43681 1727204730.61600: Set connection var ansible_shell_executable to /bin/sh 43681 1727204730.61623: variable 'ansible_shell_executable' from source: unknown 43681 1727204730.61627: variable 'ansible_connection' from source: unknown 43681 1727204730.61634: variable 'ansible_module_compression' from source: unknown 43681 1727204730.61636: variable 'ansible_shell_type' from source: unknown 43681 1727204730.61639: variable 'ansible_shell_executable' from source: unknown 43681 1727204730.61643: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204730.61649: variable 'ansible_pipelining' from source: unknown 43681 1727204730.61651: variable 'ansible_timeout' from source: unknown 43681 1727204730.61657: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204730.61780: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 43681 1727204730.61793: variable 'omit' from source: magic vars 43681 1727204730.61799: starting attempt loop 43681 1727204730.61802: running the handler 43681 1727204730.61849: handler run complete 43681 1727204730.61864: attempt loop complete, returning result 43681 1727204730.61868: _execute() done 43681 1727204730.61870: dumping result to json 43681 1727204730.61875: done dumping result, returning 43681 1727204730.61882: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider [12b410aa-8751-9e86-7728-000000000092] 43681 1727204730.61892: sending task result for task 12b410aa-8751-9e86-7728-000000000092 43681 1727204730.61983: done sending task result for task 12b410aa-8751-9e86-7728-000000000092 43681 1727204730.61986: WORKER PROCESS EXITING ok: [managed-node3] => {} MSG: Using network provider: nm 43681 1727204730.62054: no more pending results, returning what we have 43681 1727204730.62058: results queue empty 43681 1727204730.62059: checking for any_errors_fatal 43681 1727204730.62070: done checking for any_errors_fatal 43681 1727204730.62071: checking for max_fail_percentage 43681 1727204730.62072: done checking for max_fail_percentage 43681 1727204730.62074: checking to see if all hosts have failed and the running result is not ok 43681 1727204730.62075: done checking to see if all hosts have failed 43681 1727204730.62075: getting the remaining hosts for this loop 43681 1727204730.62077: done getting the remaining hosts for this loop 43681 1727204730.62082: getting the next task for host managed-node3 43681 1727204730.62088: done getting next task for host managed-node3 43681 1727204730.62098: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 43681 1727204730.62105: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204730.62116: getting variables 43681 1727204730.62118: in VariableManager get_vars() 43681 1727204730.62153: Calling all_inventory to load vars for managed-node3 43681 1727204730.62156: Calling groups_inventory to load vars for managed-node3 43681 1727204730.62159: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204730.62169: Calling all_plugins_play to load vars for managed-node3 43681 1727204730.62172: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204730.62175: Calling groups_plugins_play to load vars for managed-node3 43681 1727204730.63424: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204730.65037: done with get_vars() 43681 1727204730.65064: done getting variables 43681 1727204730.65125: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:05:30 -0400 (0:00:00.050) 0:00:38.318 ***** 43681 1727204730.65155: entering _queue_task() for managed-node3/fail 43681 1727204730.65426: worker is 1 (out of 1 available) 43681 1727204730.65442: exiting _queue_task() for managed-node3/fail 43681 1727204730.65454: done queuing things up, now waiting for results queue to drain 43681 1727204730.65457: waiting for pending results... 
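The "Print network provider" task above is a plain debug action: it reads network_provider (set earlier via set_fact) and reports "Using network provider: nm" for this host. A minimal sketch of a task with that behavior, assuming the message template; the actual wording at roles/network/tasks/main.yml:7 is not reproduced in this log:

    - name: Print network provider
      ansible.builtin.debug:
        msg: "Using network provider: {{ network_provider }}"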
43681 1727204730.65651: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 43681 1727204730.65737: in run() - task 12b410aa-8751-9e86-7728-000000000093 43681 1727204730.65750: variable 'ansible_search_path' from source: unknown 43681 1727204730.65753: variable 'ansible_search_path' from source: unknown 43681 1727204730.65788: calling self._execute() 43681 1727204730.65874: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204730.65880: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204730.65893: variable 'omit' from source: magic vars 43681 1727204730.66241: variable 'ansible_distribution_major_version' from source: facts 43681 1727204730.66245: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204730.66344: variable 'network_state' from source: role '' defaults 43681 1727204730.66362: Evaluated conditional (network_state != {}): False 43681 1727204730.66365: when evaluation is False, skipping this task 43681 1727204730.66369: _execute() done 43681 1727204730.66372: dumping result to json 43681 1727204730.66375: done dumping result, returning 43681 1727204730.66378: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12b410aa-8751-9e86-7728-000000000093] 43681 1727204730.66383: sending task result for task 12b410aa-8751-9e86-7728-000000000093 43681 1727204730.66479: done sending task result for task 12b410aa-8751-9e86-7728-000000000093 43681 1727204730.66482: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 43681 1727204730.66540: no more pending results, returning what we have 43681 1727204730.66545: results queue empty 43681 1727204730.66546: checking for any_errors_fatal 43681 1727204730.66554: done checking for any_errors_fatal 43681 1727204730.66555: checking for max_fail_percentage 43681 1727204730.66557: done checking for max_fail_percentage 43681 1727204730.66558: checking to see if all hosts have failed and the running result is not ok 43681 1727204730.66559: done checking to see if all hosts have failed 43681 1727204730.66560: getting the remaining hosts for this loop 43681 1727204730.66562: done getting the remaining hosts for this loop 43681 1727204730.66566: getting the next task for host managed-node3 43681 1727204730.66571: done getting next task for host managed-node3 43681 1727204730.66576: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 43681 1727204730.66578: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204730.66601: getting variables 43681 1727204730.66603: in VariableManager get_vars() 43681 1727204730.66640: Calling all_inventory to load vars for managed-node3 43681 1727204730.66642: Calling groups_inventory to load vars for managed-node3 43681 1727204730.66645: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204730.66655: Calling all_plugins_play to load vars for managed-node3 43681 1727204730.66658: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204730.66661: Calling groups_plugins_play to load vars for managed-node3 43681 1727204730.71446: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204730.73037: done with get_vars() 43681 1727204730.73063: done getting variables 43681 1727204730.73108: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:05:30 -0400 (0:00:00.079) 0:00:38.397 ***** 43681 1727204730.73131: entering _queue_task() for managed-node3/fail 43681 1727204730.73411: worker is 1 (out of 1 available) 43681 1727204730.73426: exiting _queue_task() for managed-node3/fail 43681 1727204730.73439: done queuing things up, now waiting for results queue to drain 43681 1727204730.73441: waiting for pending results... 
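The skipped result above is a fail action guarded by the network_state variable: the recorded false_condition is "network_state != {}", and with the role default of an empty network_state the abort never triggers. A sketch of the general shape of such a guard; the message text is an assumption, and the real task may carry additional conditions (for example a provider check) that a skip on the first false condition would not reveal:

    - name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
      ansible.builtin.fail:
        # Assumed wording; the real message is not shown in this transcript.
        msg: Applying the network state configuration is not supported with the initscripts provider
      when: network_state != {}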
43681 1727204730.73649: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 43681 1727204730.73736: in run() - task 12b410aa-8751-9e86-7728-000000000094 43681 1727204730.73749: variable 'ansible_search_path' from source: unknown 43681 1727204730.73753: variable 'ansible_search_path' from source: unknown 43681 1727204730.73786: calling self._execute() 43681 1727204730.73876: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204730.73885: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204730.73903: variable 'omit' from source: magic vars 43681 1727204730.74245: variable 'ansible_distribution_major_version' from source: facts 43681 1727204730.74254: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204730.74364: variable 'network_state' from source: role '' defaults 43681 1727204730.74375: Evaluated conditional (network_state != {}): False 43681 1727204730.74378: when evaluation is False, skipping this task 43681 1727204730.74381: _execute() done 43681 1727204730.74384: dumping result to json 43681 1727204730.74391: done dumping result, returning 43681 1727204730.74399: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12b410aa-8751-9e86-7728-000000000094] 43681 1727204730.74406: sending task result for task 12b410aa-8751-9e86-7728-000000000094 43681 1727204730.74504: done sending task result for task 12b410aa-8751-9e86-7728-000000000094 43681 1727204730.74509: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 43681 1727204730.74560: no more pending results, returning what we have 43681 1727204730.74565: results queue empty 43681 1727204730.74566: checking for any_errors_fatal 43681 1727204730.74576: done checking for any_errors_fatal 43681 1727204730.74577: checking for max_fail_percentage 43681 1727204730.74579: done checking for max_fail_percentage 43681 1727204730.74580: checking to see if all hosts have failed and the running result is not ok 43681 1727204730.74581: done checking to see if all hosts have failed 43681 1727204730.74582: getting the remaining hosts for this loop 43681 1727204730.74584: done getting the remaining hosts for this loop 43681 1727204730.74591: getting the next task for host managed-node3 43681 1727204730.74596: done getting next task for host managed-node3 43681 1727204730.74601: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 43681 1727204730.74603: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204730.74619: getting variables 43681 1727204730.74623: in VariableManager get_vars() 43681 1727204730.74659: Calling all_inventory to load vars for managed-node3 43681 1727204730.74662: Calling groups_inventory to load vars for managed-node3 43681 1727204730.74664: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204730.74675: Calling all_plugins_play to load vars for managed-node3 43681 1727204730.74678: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204730.74681: Calling groups_plugins_play to load vars for managed-node3 43681 1727204730.75942: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204730.77671: done with get_vars() 43681 1727204730.77695: done getting variables 43681 1727204730.77751: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:05:30 -0400 (0:00:00.046) 0:00:38.444 ***** 43681 1727204730.77774: entering _queue_task() for managed-node3/fail 43681 1727204730.78040: worker is 1 (out of 1 available) 43681 1727204730.78054: exiting _queue_task() for managed-node3/fail 43681 1727204730.78067: done queuing things up, now waiting for results queue to drain 43681 1727204730.78069: waiting for pending results... 
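The second abort task is skipped on the same "network_state != {}" guard. Note also that every task in this run first evaluates "ansible_distribution_major_version != '6'": that pattern is consistent with a conditional inherited from the point where the role is pulled in, although the calling playbook itself is not visible in this transcript. A hedged sketch of one way such an inherited guard can be expressed (task name and import style are assumptions):

    - name: Import the network role on supported distributions
      ansible.builtin.import_role:
        name: fedora.linux_system_roles.network
      when: ansible_distribution_major_version != '6'   # statically imported tasks inherit this condition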
43681 1727204730.78263: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 43681 1727204730.78352: in run() - task 12b410aa-8751-9e86-7728-000000000095 43681 1727204730.78367: variable 'ansible_search_path' from source: unknown 43681 1727204730.78370: variable 'ansible_search_path' from source: unknown 43681 1727204730.78412: calling self._execute() 43681 1727204730.78497: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204730.78504: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204730.78518: variable 'omit' from source: magic vars 43681 1727204730.78846: variable 'ansible_distribution_major_version' from source: facts 43681 1727204730.78855: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204730.79010: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 43681 1727204730.80814: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 43681 1727204730.80877: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 43681 1727204730.80912: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 43681 1727204730.80946: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 43681 1727204730.80969: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 43681 1727204730.81046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204730.81072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204730.81097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204730.81131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204730.81147: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204730.81235: variable 'ansible_distribution_major_version' from source: facts 43681 1727204730.81251: Evaluated conditional (ansible_distribution_major_version | int > 9): True 43681 1727204730.81353: variable 'ansible_distribution' from source: facts 43681 1727204730.81357: variable '__network_rh_distros' from source: role '' defaults 43681 1727204730.81367: Evaluated conditional (ansible_distribution in __network_rh_distros): False 43681 1727204730.81371: when evaluation is False, skipping this task 43681 1727204730.81374: _execute() done 43681 1727204730.81380: dumping result to json 43681 1727204730.81384: done dumping result, returning 43681 1727204730.81393: done running TaskExecutor() 
for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12b410aa-8751-9e86-7728-000000000095] 43681 1727204730.81399: sending task result for task 12b410aa-8751-9e86-7728-000000000095 skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 43681 1727204730.81614: no more pending results, returning what we have 43681 1727204730.81618: results queue empty 43681 1727204730.81619: checking for any_errors_fatal 43681 1727204730.81628: done checking for any_errors_fatal 43681 1727204730.81629: checking for max_fail_percentage 43681 1727204730.81631: done checking for max_fail_percentage 43681 1727204730.81632: checking to see if all hosts have failed and the running result is not ok 43681 1727204730.81633: done checking to see if all hosts have failed 43681 1727204730.81634: getting the remaining hosts for this loop 43681 1727204730.81636: done getting the remaining hosts for this loop 43681 1727204730.81640: getting the next task for host managed-node3 43681 1727204730.81647: done getting next task for host managed-node3 43681 1727204730.81652: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 43681 1727204730.81654: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 43681 1727204730.81667: getting variables 43681 1727204730.81669: in VariableManager get_vars() 43681 1727204730.81708: Calling all_inventory to load vars for managed-node3 43681 1727204730.81711: Calling groups_inventory to load vars for managed-node3 43681 1727204730.81714: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204730.81726: Calling all_plugins_play to load vars for managed-node3 43681 1727204730.81729: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204730.81732: Calling groups_plugins_play to load vars for managed-node3 43681 1727204730.82813: done sending task result for task 12b410aa-8751-9e86-7728-000000000095 43681 1727204730.82817: WORKER PROCESS EXITING 43681 1727204730.83078: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204730.85557: done with get_vars() 43681 1727204730.85597: done getting variables 43681 1727204730.85657: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 15:05:30 -0400 (0:00:00.079) 0:00:38.523 ***** 43681 1727204730.85683: entering _queue_task() for managed-node3/dnf 43681 1727204730.85957: worker is 1 (out of 1 available) 43681 1727204730.85970: exiting _queue_task() 
for managed-node3/dnf 43681 1727204730.85982: done queuing things up, now waiting for results queue to drain 43681 1727204730.85984: waiting for pending results... 43681 1727204730.86176: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 43681 1727204730.86261: in run() - task 12b410aa-8751-9e86-7728-000000000096 43681 1727204730.86275: variable 'ansible_search_path' from source: unknown 43681 1727204730.86279: variable 'ansible_search_path' from source: unknown 43681 1727204730.86312: calling self._execute() 43681 1727204730.86400: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204730.86407: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204730.86417: variable 'omit' from source: magic vars 43681 1727204730.86745: variable 'ansible_distribution_major_version' from source: facts 43681 1727204730.86756: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204730.86934: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 43681 1727204730.89696: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 43681 1727204730.89700: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 43681 1727204730.89752: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 43681 1727204730.89802: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 43681 1727204730.89838: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 43681 1727204730.89954: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204730.89996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204730.90030: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204730.90081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204730.90104: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204730.90244: variable 'ansible_distribution' from source: facts 43681 1727204730.90254: variable 'ansible_distribution_major_version' from source: facts 43681 1727204730.90266: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 43681 1727204730.90410: variable '__network_wireless_connections_defined' from source: role '' defaults 43681 1727204730.90587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204730.90623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204730.90794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204730.90798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204730.90801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204730.90804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204730.90817: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204730.90852: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204730.90907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204730.90928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204730.90982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204730.91019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204730.91054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204730.91110: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204730.91132: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204730.91336: variable 'network_connections' from source: play vars 43681 1727204730.91354: variable 'profile' from source: play vars 43681 1727204730.91440: variable 'profile' from source: play vars 43681 1727204730.91450: variable 'interface' from source: set_fact 43681 
1727204730.91527: variable 'interface' from source: set_fact 43681 1727204730.91622: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 43681 1727204730.91827: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 43681 1727204730.91878: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 43681 1727204730.91923: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 43681 1727204730.91977: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 43681 1727204730.92035: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 43681 1727204730.92067: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 43681 1727204730.92113: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204730.92294: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 43681 1727204730.92298: variable '__network_team_connections_defined' from source: role '' defaults 43681 1727204730.92532: variable 'network_connections' from source: play vars 43681 1727204730.92544: variable 'profile' from source: play vars 43681 1727204730.92622: variable 'profile' from source: play vars 43681 1727204730.92633: variable 'interface' from source: set_fact 43681 1727204730.92710: variable 'interface' from source: set_fact 43681 1727204730.92744: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 43681 1727204730.92753: when evaluation is False, skipping this task 43681 1727204730.92761: _execute() done 43681 1727204730.92769: dumping result to json 43681 1727204730.92776: done dumping result, returning 43681 1727204730.92788: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12b410aa-8751-9e86-7728-000000000096] 43681 1727204730.92802: sending task result for task 12b410aa-8751-9e86-7728-000000000096 skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 43681 1727204730.92969: no more pending results, returning what we have 43681 1727204730.92973: results queue empty 43681 1727204730.92974: checking for any_errors_fatal 43681 1727204730.92980: done checking for any_errors_fatal 43681 1727204730.92981: checking for max_fail_percentage 43681 1727204730.92984: done checking for max_fail_percentage 43681 1727204730.92985: checking to see if all hosts have failed and the running result is not ok 43681 1727204730.92986: done checking to see if all hosts have failed 43681 1727204730.92986: getting the remaining hosts for this loop 43681 1727204730.92988: done getting the remaining hosts for this loop 
43681 1727204730.92995: getting the next task for host managed-node3 43681 1727204730.93001: done getting next task for host managed-node3 43681 1727204730.93006: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 43681 1727204730.93008: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 43681 1727204730.93025: done sending task result for task 12b410aa-8751-9e86-7728-000000000096 43681 1727204730.93029: WORKER PROCESS EXITING 43681 1727204730.93142: getting variables 43681 1727204730.93144: in VariableManager get_vars() 43681 1727204730.93186: Calling all_inventory to load vars for managed-node3 43681 1727204730.93191: Calling groups_inventory to load vars for managed-node3 43681 1727204730.93194: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204730.93204: Calling all_plugins_play to load vars for managed-node3 43681 1727204730.93208: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204730.93211: Calling groups_plugins_play to load vars for managed-node3 43681 1727204730.95661: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204730.98627: done with get_vars() 43681 1727204730.98673: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 43681 1727204730.98772: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:05:30 -0400 (0:00:00.131) 0:00:38.654 ***** 43681 1727204730.98811: entering _queue_task() for managed-node3/yum 43681 1727204730.99186: worker is 1 (out of 1 available) 43681 1727204730.99204: exiting _queue_task() for managed-node3/yum 43681 1727204730.99217: done queuing things up, now waiting for results queue to drain 43681 1727204730.99219: waiting for pending results... 
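Note the "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" message above: on this ansible-core 2.17 controller the yum alias is served by the dnf action plugin. The following sketch only illustrates that resolution; the package name and arguments are assumptions, not taken from the role.

---
# Sketch only: an ansible.builtin.yum task in this run is handled by dnf,
# matching the "redirecting (type: action)" message in the log.
- hosts: localhost
  gather_facts: false
  tasks:
    - name: Query a package through the yum alias (resolved to dnf here)
      ansible.builtin.yum:
        name: NetworkManager
        state: present
      check_mode: true   # report only, make no changes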
43681 1727204730.99612: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 43681 1727204730.99659: in run() - task 12b410aa-8751-9e86-7728-000000000097 43681 1727204730.99682: variable 'ansible_search_path' from source: unknown 43681 1727204730.99694: variable 'ansible_search_path' from source: unknown 43681 1727204730.99744: calling self._execute() 43681 1727204730.99866: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204730.99882: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204730.99903: variable 'omit' from source: magic vars 43681 1727204731.00368: variable 'ansible_distribution_major_version' from source: facts 43681 1727204731.00386: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204731.00625: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 43681 1727204731.03351: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 43681 1727204731.03445: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 43681 1727204731.03494: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 43681 1727204731.03541: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 43681 1727204731.03576: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 43681 1727204731.03688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204731.03794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204731.03798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204731.03838: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204731.03864: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204731.03994: variable 'ansible_distribution_major_version' from source: facts 43681 1727204731.04064: Evaluated conditional (ansible_distribution_major_version | int < 8): False 43681 1727204731.04067: when evaluation is False, skipping this task 43681 1727204731.04070: _execute() done 43681 1727204731.04073: dumping result to json 43681 1727204731.04075: done dumping result, returning 43681 1727204731.04078: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12b410aa-8751-9e86-7728-000000000097] 43681 
1727204731.04082: sending task result for task 12b410aa-8751-9e86-7728-000000000097 skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 43681 1727204731.04257: no more pending results, returning what we have 43681 1727204731.04263: results queue empty 43681 1727204731.04264: checking for any_errors_fatal 43681 1727204731.04272: done checking for any_errors_fatal 43681 1727204731.04273: checking for max_fail_percentage 43681 1727204731.04275: done checking for max_fail_percentage 43681 1727204731.04277: checking to see if all hosts have failed and the running result is not ok 43681 1727204731.04278: done checking to see if all hosts have failed 43681 1727204731.04279: getting the remaining hosts for this loop 43681 1727204731.04281: done getting the remaining hosts for this loop 43681 1727204731.04286: getting the next task for host managed-node3 43681 1727204731.04295: done getting next task for host managed-node3 43681 1727204731.04302: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 43681 1727204731.04304: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 43681 1727204731.04322: getting variables 43681 1727204731.04324: in VariableManager get_vars() 43681 1727204731.04371: Calling all_inventory to load vars for managed-node3 43681 1727204731.04374: Calling groups_inventory to load vars for managed-node3 43681 1727204731.04377: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204731.04696: Calling all_plugins_play to load vars for managed-node3 43681 1727204731.04702: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204731.04708: Calling groups_plugins_play to load vars for managed-node3 43681 1727204731.05406: done sending task result for task 12b410aa-8751-9e86-7728-000000000097 43681 1727204731.05410: WORKER PROCESS EXITING 43681 1727204731.07011: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204731.10144: done with get_vars() 43681 1727204731.10181: done getting variables 43681 1727204731.10256: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:05:31 -0400 (0:00:00.114) 0:00:38.769 ***** 43681 1727204731.10294: entering _queue_task() for managed-node3/fail 43681 1727204731.10663: worker is 1 (out of 1 available) 43681 1727204731.10677: exiting _queue_task() for managed-node3/fail 43681 1727204731.10693: done queuing things up, now waiting for results queue to drain 43681 1727204731.10695: waiting for pending results... 
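The YUM variant of the update check is meant only for older distributions; here "ansible_distribution_major_version | int < 8" evaluates to False, so it is skipped just like the DNF variant before it. A sketch of that version gate, with the condition copied from the log and everything else assumed:

---
# Sketch only: the YUM code path is fenced off to EL7-era hosts.
- hosts: localhost
  gather_facts: true   # supplies ansible_distribution_major_version
  tasks:
    - name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
      ansible.builtin.debug:
        msg: "would fall back to the yum backend here"
      when: ansible_distribution_major_version | int < 8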
43681 1727204731.11006: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 43681 1727204731.11142: in run() - task 12b410aa-8751-9e86-7728-000000000098 43681 1727204731.11166: variable 'ansible_search_path' from source: unknown 43681 1727204731.11175: variable 'ansible_search_path' from source: unknown 43681 1727204731.11222: calling self._execute() 43681 1727204731.11349: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204731.11365: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204731.11382: variable 'omit' from source: magic vars 43681 1727204731.11838: variable 'ansible_distribution_major_version' from source: facts 43681 1727204731.11858: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204731.12028: variable '__network_wireless_connections_defined' from source: role '' defaults 43681 1727204731.12296: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 43681 1727204731.14969: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 43681 1727204731.15074: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 43681 1727204731.15125: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 43681 1727204731.15178: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 43681 1727204731.15216: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 43681 1727204731.15317: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204731.15359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204731.15401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204731.15460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204731.15482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204731.15550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204731.15585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204731.15629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204731.15686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204731.15711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204731.15773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204731.15811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204731.15852: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204731.15908: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204731.15932: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204731.16195: variable 'network_connections' from source: play vars 43681 1727204731.16199: variable 'profile' from source: play vars 43681 1727204731.16296: variable 'profile' from source: play vars 43681 1727204731.16308: variable 'interface' from source: set_fact 43681 1727204731.16594: variable 'interface' from source: set_fact 43681 1727204731.16598: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 43681 1727204731.16717: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 43681 1727204731.16768: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 43681 1727204731.16812: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 43681 1727204731.16856: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 43681 1727204731.16915: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 43681 1727204731.16952: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 43681 1727204731.16988: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204731.17028: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 43681 1727204731.17095: 
variable '__network_team_connections_defined' from source: role '' defaults 43681 1727204731.17442: variable 'network_connections' from source: play vars 43681 1727204731.17454: variable 'profile' from source: play vars 43681 1727204731.17539: variable 'profile' from source: play vars 43681 1727204731.17549: variable 'interface' from source: set_fact 43681 1727204731.17632: variable 'interface' from source: set_fact 43681 1727204731.17666: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 43681 1727204731.17675: when evaluation is False, skipping this task 43681 1727204731.17683: _execute() done 43681 1727204731.17696: dumping result to json 43681 1727204731.17707: done dumping result, returning 43681 1727204731.17721: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12b410aa-8751-9e86-7728-000000000098] 43681 1727204731.17742: sending task result for task 12b410aa-8751-9e86-7728-000000000098 43681 1727204731.18095: done sending task result for task 12b410aa-8751-9e86-7728-000000000098 43681 1727204731.18099: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 43681 1727204731.18154: no more pending results, returning what we have 43681 1727204731.18158: results queue empty 43681 1727204731.18160: checking for any_errors_fatal 43681 1727204731.18166: done checking for any_errors_fatal 43681 1727204731.18167: checking for max_fail_percentage 43681 1727204731.18169: done checking for max_fail_percentage 43681 1727204731.18170: checking to see if all hosts have failed and the running result is not ok 43681 1727204731.18171: done checking to see if all hosts have failed 43681 1727204731.18172: getting the remaining hosts for this loop 43681 1727204731.18174: done getting the remaining hosts for this loop 43681 1727204731.18179: getting the next task for host managed-node3 43681 1727204731.18185: done getting next task for host managed-node3 43681 1727204731.18191: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 43681 1727204731.18194: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204731.18208: getting variables 43681 1727204731.18210: in VariableManager get_vars() 43681 1727204731.18252: Calling all_inventory to load vars for managed-node3 43681 1727204731.18255: Calling groups_inventory to load vars for managed-node3 43681 1727204731.18258: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204731.18270: Calling all_plugins_play to load vars for managed-node3 43681 1727204731.18274: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204731.18279: Calling groups_plugins_play to load vars for managed-node3 43681 1727204731.20612: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204731.23595: done with get_vars() 43681 1727204731.23638: done getting variables 43681 1727204731.23714: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:05:31 -0400 (0:00:00.134) 0:00:38.904 ***** 43681 1727204731.23752: entering _queue_task() for managed-node3/package 43681 1727204731.24128: worker is 1 (out of 1 available) 43681 1727204731.24144: exiting _queue_task() for managed-node3/package 43681 1727204731.24158: done queuing things up, now waiting for results queue to drain 43681 1727204731.24160: waiting for pending results... 43681 1727204731.24468: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages 43681 1727204731.24598: in run() - task 12b410aa-8751-9e86-7728-000000000099 43681 1727204731.24625: variable 'ansible_search_path' from source: unknown 43681 1727204731.24635: variable 'ansible_search_path' from source: unknown 43681 1727204731.24680: calling self._execute() 43681 1727204731.24801: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204731.24816: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204731.24838: variable 'omit' from source: magic vars 43681 1727204731.25297: variable 'ansible_distribution_major_version' from source: facts 43681 1727204731.25316: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204731.25577: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 43681 1727204731.25901: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 43681 1727204731.25968: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 43681 1727204731.26021: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 43681 1727204731.26117: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 43681 1727204731.26268: variable 'network_packages' from source: role '' defaults 43681 1727204731.26416: variable '__network_provider_setup' from source: role '' defaults 43681 1727204731.26437: variable '__network_service_name_default_nm' from source: role '' defaults 43681 1727204731.26532: variable 
'__network_service_name_default_nm' from source: role '' defaults 43681 1727204731.26562: variable '__network_packages_default_nm' from source: role '' defaults 43681 1727204731.26635: variable '__network_packages_default_nm' from source: role '' defaults 43681 1727204731.26994: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 43681 1727204731.29727: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 43681 1727204731.29807: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 43681 1727204731.29858: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 43681 1727204731.29902: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 43681 1727204731.29940: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 43681 1727204731.30046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204731.30088: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204731.30131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204731.30296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204731.30300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204731.30303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204731.30318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204731.30355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204731.30414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204731.30440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204731.30859: variable '__network_packages_default_gobject_packages' from source: role '' defaults 43681 1727204731.30924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204731.30982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204731.31021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204731.31082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204731.31109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204731.31230: variable 'ansible_python' from source: facts 43681 1727204731.31267: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 43681 1727204731.31377: variable '__network_wpa_supplicant_required' from source: role '' defaults 43681 1727204731.31485: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 43681 1727204731.31665: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204731.31702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204731.31744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204731.31799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204731.31823: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204731.31893: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204731.31951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204731.31994: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204731.32044: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204731.32167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204731.32267: variable 'network_connections' from source: play vars 43681 1727204731.32285: variable 'profile' from source: play vars 43681 1727204731.32418: variable 'profile' from source: play vars 43681 1727204731.32434: variable 'interface' from source: set_fact 43681 1727204731.32526: variable 'interface' from source: set_fact 43681 1727204731.32622: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 43681 1727204731.32660: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 43681 1727204731.32708: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204731.32758: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 43681 1727204731.32822: variable '__network_wireless_connections_defined' from source: role '' defaults 43681 1727204731.33220: variable 'network_connections' from source: play vars 43681 1727204731.33236: variable 'profile' from source: play vars 43681 1727204731.33353: variable 'profile' from source: play vars 43681 1727204731.33360: variable 'interface' from source: set_fact 43681 1727204731.33420: variable 'interface' from source: set_fact 43681 1727204731.33451: variable '__network_packages_default_wireless' from source: role '' defaults 43681 1727204731.33526: variable '__network_wireless_connections_defined' from source: role '' defaults 43681 1727204731.33777: variable 'network_connections' from source: play vars 43681 1727204731.33782: variable 'profile' from source: play vars 43681 1727204731.33839: variable 'profile' from source: play vars 43681 1727204731.33843: variable 'interface' from source: set_fact 43681 1727204731.33962: variable 'interface' from source: set_fact 43681 1727204731.33967: variable '__network_packages_default_team' from source: role '' defaults 43681 1727204731.34019: variable '__network_team_connections_defined' from source: role '' defaults 43681 1727204731.34269: variable 'network_connections' from source: play vars 43681 1727204731.34272: variable 'profile' from source: play vars 43681 1727204731.34333: variable 'profile' from source: play vars 43681 1727204731.34336: variable 'interface' from source: set_fact 43681 1727204731.34419: variable 'interface' from source: set_fact 43681 1727204731.34465: variable '__network_service_name_default_initscripts' from source: role '' defaults 43681 1727204731.34518: variable '__network_service_name_default_initscripts' from source: role '' defaults 43681 1727204731.34529: variable '__network_packages_default_initscripts' from source: role '' defaults 43681 1727204731.34576: variable '__network_packages_default_initscripts' from source: role '' defaults 43681 1727204731.34761: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 43681 1727204731.35897: variable 'network_connections' from source: play vars 43681 1727204731.35900: variable 'profile' from source: play vars 43681 
1727204731.35987: variable 'profile' from source: play vars 43681 1727204731.36008: variable 'interface' from source: set_fact 43681 1727204731.36117: variable 'interface' from source: set_fact 43681 1727204731.36123: variable 'ansible_distribution' from source: facts 43681 1727204731.36201: variable '__network_rh_distros' from source: role '' defaults 43681 1727204731.36204: variable 'ansible_distribution_major_version' from source: facts 43681 1727204731.36207: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 43681 1727204731.36436: variable 'ansible_distribution' from source: facts 43681 1727204731.36455: variable '__network_rh_distros' from source: role '' defaults 43681 1727204731.36468: variable 'ansible_distribution_major_version' from source: facts 43681 1727204731.36482: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 43681 1727204731.36745: variable 'ansible_distribution' from source: facts 43681 1727204731.36761: variable '__network_rh_distros' from source: role '' defaults 43681 1727204731.36780: variable 'ansible_distribution_major_version' from source: facts 43681 1727204731.37004: variable 'network_provider' from source: set_fact 43681 1727204731.37008: variable 'ansible_facts' from source: unknown 43681 1727204731.38300: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 43681 1727204731.38318: when evaluation is False, skipping this task 43681 1727204731.38333: _execute() done 43681 1727204731.38341: dumping result to json 43681 1727204731.38349: done dumping result, returning 43681 1727204731.38364: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages [12b410aa-8751-9e86-7728-000000000099] 43681 1727204731.38374: sending task result for task 12b410aa-8751-9e86-7728-000000000099 skipping: [managed-node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 43681 1727204731.38660: no more pending results, returning what we have 43681 1727204731.38666: results queue empty 43681 1727204731.38668: checking for any_errors_fatal 43681 1727204731.38675: done checking for any_errors_fatal 43681 1727204731.38676: checking for max_fail_percentage 43681 1727204731.38678: done checking for max_fail_percentage 43681 1727204731.38680: checking to see if all hosts have failed and the running result is not ok 43681 1727204731.38681: done checking to see if all hosts have failed 43681 1727204731.38682: getting the remaining hosts for this loop 43681 1727204731.38684: done getting the remaining hosts for this loop 43681 1727204731.38692: getting the next task for host managed-node3 43681 1727204731.38699: done getting next task for host managed-node3 43681 1727204731.38705: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 43681 1727204731.38708: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204731.38727: getting variables 43681 1727204731.38729: in VariableManager get_vars() 43681 1727204731.38778: Calling all_inventory to load vars for managed-node3 43681 1727204731.38782: Calling groups_inventory to load vars for managed-node3 43681 1727204731.38785: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204731.39018: Calling all_plugins_play to load vars for managed-node3 43681 1727204731.39032: done sending task result for task 12b410aa-8751-9e86-7728-000000000099 43681 1727204731.39050: WORKER PROCESS EXITING 43681 1727204731.39044: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204731.39057: Calling groups_plugins_play to load vars for managed-node3 43681 1727204731.42055: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204731.45467: done with get_vars() 43681 1727204731.45527: done getting variables 43681 1727204731.45606: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 15:05:31 -0400 (0:00:00.219) 0:00:39.123 ***** 43681 1727204731.45656: entering _queue_task() for managed-node3/package 43681 1727204731.46308: worker is 1 (out of 1 available) 43681 1727204731.46323: exiting _queue_task() for managed-node3/package 43681 1727204731.46335: done queuing things up, now waiting for results queue to drain 43681 1727204731.46336: waiting for pending results... 
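The "Install packages" skip above hinges on the subset test in its condition: every entry in network_packages is already present in ansible_facts.packages (gathered earlier by package_facts), so "not network_packages is subset(ansible_facts.packages.keys())" is False and no installation is attempted. A self-contained sketch of the same pattern; the condition is verbatim from the log, while the package list and module choice are assumptions:

---
# Sketch only: install network packages only when at least one is missing.
- hosts: localhost
  gather_facts: false
  vars:
    network_packages:
      - NetworkManager
  tasks:
    - name: Gather the installed-package inventory
      ansible.builtin.package_facts:

    - name: Install packages
      ansible.builtin.package:
        name: "{{ network_packages }}"
        state: present
      when: not network_packages is subset(ansible_facts.packages.keys())

On a host where NetworkManager is already installed this reproduces the skip; where it is missing, the package task would run (and would normally need privilege escalation).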
43681 1727204731.46454: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 43681 1727204731.46603: in run() - task 12b410aa-8751-9e86-7728-00000000009a 43681 1727204731.46630: variable 'ansible_search_path' from source: unknown 43681 1727204731.46640: variable 'ansible_search_path' from source: unknown 43681 1727204731.46697: calling self._execute() 43681 1727204731.46898: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204731.46909: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204731.46912: variable 'omit' from source: magic vars 43681 1727204731.47365: variable 'ansible_distribution_major_version' from source: facts 43681 1727204731.47384: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204731.47577: variable 'network_state' from source: role '' defaults 43681 1727204731.47599: Evaluated conditional (network_state != {}): False 43681 1727204731.47606: when evaluation is False, skipping this task 43681 1727204731.47614: _execute() done 43681 1727204731.47624: dumping result to json 43681 1727204731.47633: done dumping result, returning 43681 1727204731.47646: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12b410aa-8751-9e86-7728-00000000009a] 43681 1727204731.47663: sending task result for task 12b410aa-8751-9e86-7728-00000000009a skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 43681 1727204731.47972: no more pending results, returning what we have 43681 1727204731.47978: results queue empty 43681 1727204731.47979: checking for any_errors_fatal 43681 1727204731.47990: done checking for any_errors_fatal 43681 1727204731.47991: checking for max_fail_percentage 43681 1727204731.47993: done checking for max_fail_percentage 43681 1727204731.47995: checking to see if all hosts have failed and the running result is not ok 43681 1727204731.48000: done checking to see if all hosts have failed 43681 1727204731.48001: getting the remaining hosts for this loop 43681 1727204731.48003: done getting the remaining hosts for this loop 43681 1727204731.48008: getting the next task for host managed-node3 43681 1727204731.48016: done getting next task for host managed-node3 43681 1727204731.48024: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 43681 1727204731.48027: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204731.48045: getting variables 43681 1727204731.48047: in VariableManager get_vars() 43681 1727204731.48196: Calling all_inventory to load vars for managed-node3 43681 1727204731.48199: Calling groups_inventory to load vars for managed-node3 43681 1727204731.48208: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204731.48229: Calling all_plugins_play to load vars for managed-node3 43681 1727204731.48234: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204731.48238: Calling groups_plugins_play to load vars for managed-node3 43681 1727204731.48841: done sending task result for task 12b410aa-8751-9e86-7728-00000000009a 43681 1727204731.48845: WORKER PROCESS EXITING 43681 1727204731.50952: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204731.54506: done with get_vars() 43681 1727204731.54558: done getting variables 43681 1727204731.54648: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 15:05:31 -0400 (0:00:00.090) 0:00:39.213 ***** 43681 1727204731.54687: entering _queue_task() for managed-node3/package 43681 1727204731.55327: worker is 1 (out of 1 available) 43681 1727204731.55339: exiting _queue_task() for managed-node3/package 43681 1727204731.55350: done queuing things up, now waiting for results queue to drain 43681 1727204731.55352: waiting for pending results... 
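Both nmstate-related installs in this stretch of the log are gated on the network_state variable, which is empty for this play (the test drives the role through network_connections instead), so "network_state != {}" is False. A sketch of that gate, with assumed values:

---
# Sketch only: nmstate packages are pulled in solely for network_state users.
- hosts: localhost
  gather_facts: false
  vars:
    network_state: {}   # empty in this run, hence the skips
  tasks:
    - name: Install NetworkManager and nmstate when using network_state variable
      ansible.builtin.package:
        name:
          - NetworkManager
          - nmstate
        state: present
      when: network_state != {}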
43681 1727204731.55480: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 43681 1727204731.55627: in run() - task 12b410aa-8751-9e86-7728-00000000009b 43681 1727204731.55650: variable 'ansible_search_path' from source: unknown 43681 1727204731.55658: variable 'ansible_search_path' from source: unknown 43681 1727204731.55714: calling self._execute() 43681 1727204731.55844: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204731.55858: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204731.55873: variable 'omit' from source: magic vars 43681 1727204731.56428: variable 'ansible_distribution_major_version' from source: facts 43681 1727204731.56454: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204731.56645: variable 'network_state' from source: role '' defaults 43681 1727204731.56683: Evaluated conditional (network_state != {}): False 43681 1727204731.56691: when evaluation is False, skipping this task 43681 1727204731.56785: _execute() done 43681 1727204731.56792: dumping result to json 43681 1727204731.56795: done dumping result, returning 43681 1727204731.56798: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12b410aa-8751-9e86-7728-00000000009b] 43681 1727204731.56801: sending task result for task 12b410aa-8751-9e86-7728-00000000009b 43681 1727204731.56995: done sending task result for task 12b410aa-8751-9e86-7728-00000000009b skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 43681 1727204731.57057: no more pending results, returning what we have 43681 1727204731.57062: results queue empty 43681 1727204731.57063: checking for any_errors_fatal 43681 1727204731.57074: done checking for any_errors_fatal 43681 1727204731.57075: checking for max_fail_percentage 43681 1727204731.57077: done checking for max_fail_percentage 43681 1727204731.57078: checking to see if all hosts have failed and the running result is not ok 43681 1727204731.57080: done checking to see if all hosts have failed 43681 1727204731.57081: getting the remaining hosts for this loop 43681 1727204731.57082: done getting the remaining hosts for this loop 43681 1727204731.57087: getting the next task for host managed-node3 43681 1727204731.57097: done getting next task for host managed-node3 43681 1727204731.57102: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 43681 1727204731.57105: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204731.57125: getting variables 43681 1727204731.57127: in VariableManager get_vars() 43681 1727204731.57176: Calling all_inventory to load vars for managed-node3 43681 1727204731.57179: Calling groups_inventory to load vars for managed-node3 43681 1727204731.57300: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204731.57325: Calling all_plugins_play to load vars for managed-node3 43681 1727204731.57330: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204731.57334: Calling groups_plugins_play to load vars for managed-node3 43681 1727204731.58383: WORKER PROCESS EXITING 43681 1727204731.58756: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204731.61282: done with get_vars() 43681 1727204731.61325: done getting variables 43681 1727204731.61426: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 15:05:31 -0400 (0:00:00.067) 0:00:39.281 ***** 43681 1727204731.61468: entering _queue_task() for managed-node3/service 43681 1727204731.61839: worker is 1 (out of 1 available) 43681 1727204731.61856: exiting _queue_task() for managed-node3/service 43681 1727204731.61869: done queuing things up, now waiting for results queue to drain 43681 1727204731.61871: waiting for pending results... 
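The task queued next, "Restart NetworkManager due to wireless or team interfaces", uses the same wireless/team gate as before, so it will be skipped as well. A sketch of a conditional restart of that shape; the service module and its arguments are assumptions, the condition matches the log:

---
# Sketch only: restart NetworkManager only when wireless or team profiles
# require it; a plain ethernet profile leaves the daemon untouched.
- hosts: localhost
  gather_facts: false
  vars:
    __network_wireless_connections_defined: false
    __network_team_connections_defined: false
  tasks:
    - name: Restart NetworkManager due to wireless or team interfaces
      ansible.builtin.service:
        name: NetworkManager
        state: restarted
      when: __network_wireless_connections_defined or __network_team_connections_defined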
43681 1727204731.62086: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 43681 1727204731.62169: in run() - task 12b410aa-8751-9e86-7728-00000000009c 43681 1727204731.62184: variable 'ansible_search_path' from source: unknown 43681 1727204731.62188: variable 'ansible_search_path' from source: unknown 43681 1727204731.62223: calling self._execute() 43681 1727204731.62312: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204731.62318: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204731.62332: variable 'omit' from source: magic vars 43681 1727204731.62661: variable 'ansible_distribution_major_version' from source: facts 43681 1727204731.62672: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204731.62780: variable '__network_wireless_connections_defined' from source: role '' defaults 43681 1727204731.62958: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 43681 1727204731.65494: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 43681 1727204731.65553: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 43681 1727204731.65596: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 43681 1727204731.65629: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 43681 1727204731.65654: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 43681 1727204731.65730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204731.65758: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204731.65780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204731.65814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204731.65828: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204731.65872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204731.65894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204731.65915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 43681 1727204731.65950: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204731.65965: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204731.66002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204731.66024: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204731.66043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204731.66077: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204731.66091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204731.66240: variable 'network_connections' from source: play vars 43681 1727204731.66254: variable 'profile' from source: play vars 43681 1727204731.66325: variable 'profile' from source: play vars 43681 1727204731.66329: variable 'interface' from source: set_fact 43681 1727204731.66382: variable 'interface' from source: set_fact 43681 1727204731.66448: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 43681 1727204731.66582: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 43681 1727204731.66618: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 43681 1727204731.66647: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 43681 1727204731.66682: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 43681 1727204731.66723: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 43681 1727204731.66744: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 43681 1727204731.66766: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204731.66787: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 43681 1727204731.66843: variable '__network_team_connections_defined' from source: role '' defaults 43681 
1727204731.67044: variable 'network_connections' from source: play vars 43681 1727204731.67048: variable 'profile' from source: play vars 43681 1727204731.67106: variable 'profile' from source: play vars 43681 1727204731.67110: variable 'interface' from source: set_fact 43681 1727204731.67163: variable 'interface' from source: set_fact 43681 1727204731.67188: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 43681 1727204731.67193: when evaluation is False, skipping this task 43681 1727204731.67196: _execute() done 43681 1727204731.67200: dumping result to json 43681 1727204731.67205: done dumping result, returning 43681 1727204731.67213: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12b410aa-8751-9e86-7728-00000000009c] 43681 1727204731.67228: sending task result for task 12b410aa-8751-9e86-7728-00000000009c 43681 1727204731.67324: done sending task result for task 12b410aa-8751-9e86-7728-00000000009c 43681 1727204731.67327: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 43681 1727204731.67380: no more pending results, returning what we have 43681 1727204731.67385: results queue empty 43681 1727204731.67386: checking for any_errors_fatal 43681 1727204731.67399: done checking for any_errors_fatal 43681 1727204731.67400: checking for max_fail_percentage 43681 1727204731.67402: done checking for max_fail_percentage 43681 1727204731.67403: checking to see if all hosts have failed and the running result is not ok 43681 1727204731.67404: done checking to see if all hosts have failed 43681 1727204731.67405: getting the remaining hosts for this loop 43681 1727204731.67407: done getting the remaining hosts for this loop 43681 1727204731.67411: getting the next task for host managed-node3 43681 1727204731.67418: done getting next task for host managed-node3 43681 1727204731.67425: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 43681 1727204731.67428: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204731.67445: getting variables 43681 1727204731.67447: in VariableManager get_vars() 43681 1727204731.67494: Calling all_inventory to load vars for managed-node3 43681 1727204731.67497: Calling groups_inventory to load vars for managed-node3 43681 1727204731.67503: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204731.67515: Calling all_plugins_play to load vars for managed-node3 43681 1727204731.67519: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204731.67524: Calling groups_plugins_play to load vars for managed-node3 43681 1727204731.69009: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204731.70630: done with get_vars() 43681 1727204731.70657: done getting variables 43681 1727204731.70709: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 15:05:31 -0400 (0:00:00.092) 0:00:39.374 ***** 43681 1727204731.70737: entering _queue_task() for managed-node3/service 43681 1727204731.71010: worker is 1 (out of 1 available) 43681 1727204731.71027: exiting _queue_task() for managed-node3/service 43681 1727204731.71041: done queuing things up, now waiting for results queue to drain 43681 1727204731.71043: waiting for pending results... 43681 1727204731.71245: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 43681 1727204731.71332: in run() - task 12b410aa-8751-9e86-7728-00000000009d 43681 1727204731.71344: variable 'ansible_search_path' from source: unknown 43681 1727204731.71349: variable 'ansible_search_path' from source: unknown 43681 1727204731.71381: calling self._execute() 43681 1727204731.71471: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204731.71478: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204731.71489: variable 'omit' from source: magic vars 43681 1727204731.71816: variable 'ansible_distribution_major_version' from source: facts 43681 1727204731.71827: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204731.71968: variable 'network_provider' from source: set_fact 43681 1727204731.71972: variable 'network_state' from source: role '' defaults 43681 1727204731.71986: Evaluated conditional (network_provider == "nm" or network_state != {}): True 43681 1727204731.71995: variable 'omit' from source: magic vars 43681 1727204731.72028: variable 'omit' from source: magic vars 43681 1727204731.72051: variable 'network_service_name' from source: role '' defaults 43681 1727204731.72115: variable 'network_service_name' from source: role '' defaults 43681 1727204731.72207: variable '__network_provider_setup' from source: role '' defaults 43681 1727204731.72211: variable '__network_service_name_default_nm' from source: role '' defaults 43681 1727204731.72266: variable '__network_service_name_default_nm' from source: role '' defaults 43681 1727204731.72274: variable '__network_packages_default_nm' from source: role '' defaults 
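At this point the trace has already evaluated the gating conditionals for this step: ansible_distribution_major_version != '6' and network_provider == "nm" or network_state != {} both came back True, the service name is taken from the network_service_name role default, and the result reported further down is censored because no_log is in effect. Reconstructed from those details, the task at roles/network/tasks/main.yml:122 is roughly the following sketch (not the role's verbatim source; the service action plugin shown here is what later resolves to the ansible.legacy.systemd module in this trace):

- name: Enable and start NetworkManager
  ansible.builtin.service:
    name: "{{ network_service_name }}"   # resolves to NetworkManager for the nm provider in this run
    state: started
    enabled: true
  when: network_provider == "nm" or network_state != {}
  no_log: true   # matches the censored result reported later in the trace
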
43681 1727204731.72332: variable '__network_packages_default_nm' from source: role '' defaults 43681 1727204731.72532: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 43681 1727204731.74265: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 43681 1727204731.74330: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 43681 1727204731.74364: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 43681 1727204731.74399: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 43681 1727204731.74426: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 43681 1727204731.74494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204731.74523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204731.74545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204731.74577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204731.74590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204731.74635: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204731.74656: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204731.74676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204731.74713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204731.74726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204731.74918: variable '__network_packages_default_gobject_packages' from source: role '' defaults 43681 1727204731.75019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204731.75043: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204731.75063: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204731.75097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204731.75110: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204731.75192: variable 'ansible_python' from source: facts 43681 1727204731.75212: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 43681 1727204731.75282: variable '__network_wpa_supplicant_required' from source: role '' defaults 43681 1727204731.75352: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 43681 1727204731.75459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204731.75483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204731.75505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204731.75538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204731.75551: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204731.75597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204731.75623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204731.75643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204731.75673: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204731.75686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204731.75803: variable 'network_connections' from 
source: play vars 43681 1727204731.75810: variable 'profile' from source: play vars 43681 1727204731.75871: variable 'profile' from source: play vars 43681 1727204731.75878: variable 'interface' from source: set_fact 43681 1727204731.75935: variable 'interface' from source: set_fact 43681 1727204731.76025: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 43681 1727204731.76176: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 43681 1727204731.76219: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 43681 1727204731.76258: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 43681 1727204731.76294: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 43681 1727204731.76346: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 43681 1727204731.76375: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 43681 1727204731.76403: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204731.76431: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 43681 1727204731.76473: variable '__network_wireless_connections_defined' from source: role '' defaults 43681 1727204731.76706: variable 'network_connections' from source: play vars 43681 1727204731.76713: variable 'profile' from source: play vars 43681 1727204731.76778: variable 'profile' from source: play vars 43681 1727204731.76782: variable 'interface' from source: set_fact 43681 1727204731.76835: variable 'interface' from source: set_fact 43681 1727204731.76863: variable '__network_packages_default_wireless' from source: role '' defaults 43681 1727204731.76933: variable '__network_wireless_connections_defined' from source: role '' defaults 43681 1727204731.77172: variable 'network_connections' from source: play vars 43681 1727204731.77175: variable 'profile' from source: play vars 43681 1727204731.77238: variable 'profile' from source: play vars 43681 1727204731.77245: variable 'interface' from source: set_fact 43681 1727204731.77305: variable 'interface' from source: set_fact 43681 1727204731.77332: variable '__network_packages_default_team' from source: role '' defaults 43681 1727204731.77399: variable '__network_team_connections_defined' from source: role '' defaults 43681 1727204731.77641: variable 'network_connections' from source: play vars 43681 1727204731.77644: variable 'profile' from source: play vars 43681 1727204731.77708: variable 'profile' from source: play vars 43681 1727204731.77712: variable 'interface' from source: set_fact 43681 1727204731.77775: variable 'interface' from source: set_fact 43681 1727204731.77823: variable '__network_service_name_default_initscripts' from source: role '' defaults 43681 1727204731.77872: variable '__network_service_name_default_initscripts' from source: role '' defaults 43681 1727204731.77879: 
variable '__network_packages_default_initscripts' from source: role '' defaults 43681 1727204731.77933: variable '__network_packages_default_initscripts' from source: role '' defaults 43681 1727204731.78115: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 43681 1727204731.78529: variable 'network_connections' from source: play vars 43681 1727204731.78534: variable 'profile' from source: play vars 43681 1727204731.78586: variable 'profile' from source: play vars 43681 1727204731.78591: variable 'interface' from source: set_fact 43681 1727204731.78650: variable 'interface' from source: set_fact 43681 1727204731.78660: variable 'ansible_distribution' from source: facts 43681 1727204731.78665: variable '__network_rh_distros' from source: role '' defaults 43681 1727204731.78673: variable 'ansible_distribution_major_version' from source: facts 43681 1727204731.78686: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 43681 1727204731.78833: variable 'ansible_distribution' from source: facts 43681 1727204731.78836: variable '__network_rh_distros' from source: role '' defaults 43681 1727204731.78843: variable 'ansible_distribution_major_version' from source: facts 43681 1727204731.78850: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 43681 1727204731.78997: variable 'ansible_distribution' from source: facts 43681 1727204731.79001: variable '__network_rh_distros' from source: role '' defaults 43681 1727204731.79007: variable 'ansible_distribution_major_version' from source: facts 43681 1727204731.79038: variable 'network_provider' from source: set_fact 43681 1727204731.79058: variable 'omit' from source: magic vars 43681 1727204731.79084: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204731.79111: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204731.79128: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204731.79144: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204731.79155: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204731.79183: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204731.79188: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204731.79191: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204731.79273: Set connection var ansible_shell_type to sh 43681 1727204731.79281: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204731.79287: Set connection var ansible_timeout to 10 43681 1727204731.79298: Set connection var ansible_pipelining to False 43681 1727204731.79308: Set connection var ansible_connection to ssh 43681 1727204731.79311: Set connection var ansible_shell_executable to /bin/sh 43681 1727204731.79335: variable 'ansible_shell_executable' from source: unknown 43681 1727204731.79338: variable 'ansible_connection' from source: unknown 43681 1727204731.79341: variable 'ansible_module_compression' from source: unknown 43681 1727204731.79344: variable 'ansible_shell_type' from source: unknown 43681 1727204731.79348: variable 'ansible_shell_executable' from 
source: unknown 43681 1727204731.79351: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204731.79358: variable 'ansible_pipelining' from source: unknown 43681 1727204731.79360: variable 'ansible_timeout' from source: unknown 43681 1727204731.79365: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204731.79454: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 43681 1727204731.79464: variable 'omit' from source: magic vars 43681 1727204731.79471: starting attempt loop 43681 1727204731.79474: running the handler 43681 1727204731.79543: variable 'ansible_facts' from source: unknown 43681 1727204731.80379: _low_level_execute_command(): starting 43681 1727204731.80386: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 43681 1727204731.80887: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204731.80918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204731.80924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204731.80983: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204731.80987: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204731.80995: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204731.81036: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204731.82801: stdout chunk (state=3): >>>/root <<< 43681 1727204731.82917: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204731.82975: stderr chunk (state=3): >>><<< 43681 1727204731.82978: stdout chunk (state=3): >>><<< 43681 1727204731.82999: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204731.83010: _low_level_execute_command(): starting 43681 1727204731.83017: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204731.829996-45633-204396093780705 `" && echo ansible-tmp-1727204731.829996-45633-204396093780705="` echo /root/.ansible/tmp/ansible-tmp-1727204731.829996-45633-204396093780705 `" ) && sleep 0' 43681 1727204731.83493: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204731.83497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 43681 1727204731.83500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204731.83502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 43681 1727204731.83505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204731.83556: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204731.83559: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204731.83604: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204731.85597: stdout chunk (state=3): >>>ansible-tmp-1727204731.829996-45633-204396093780705=/root/.ansible/tmp/ansible-tmp-1727204731.829996-45633-204396093780705 <<< 43681 1727204731.85718: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204731.85766: stderr chunk (state=3): >>><<< 43681 1727204731.85769: stdout chunk (state=3): >>><<< 43681 1727204731.85785: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204731.829996-45633-204396093780705=/root/.ansible/tmp/ansible-tmp-1727204731.829996-45633-204396093780705 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204731.85817: variable 'ansible_module_compression' from source: unknown 43681 1727204731.85861: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-436817kzep24q/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 43681 1727204731.85916: variable 'ansible_facts' from source: unknown 43681 1727204731.86055: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204731.829996-45633-204396093780705/AnsiballZ_systemd.py 43681 1727204731.86178: Sending initial data 43681 1727204731.86182: Sent initial data (155 bytes) 43681 1727204731.86641: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204731.86645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 43681 1727204731.86647: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 43681 1727204731.86650: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204731.86699: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204731.86717: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204731.86748: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204731.88496: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 43681 1727204731.88550: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 43681 1727204731.88566: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204731.829996-45633-204396093780705/AnsiballZ_systemd.py" <<< 43681 1727204731.88577: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-436817kzep24q/tmpg83rsmal /root/.ansible/tmp/ansible-tmp-1727204731.829996-45633-204396093780705/AnsiballZ_systemd.py <<< 43681 1727204731.88612: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-436817kzep24q/tmpg83rsmal" to remote "/root/.ansible/tmp/ansible-tmp-1727204731.829996-45633-204396093780705/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204731.829996-45633-204396093780705/AnsiballZ_systemd.py" <<< 43681 1727204731.91231: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204731.91236: stdout chunk (state=3): >>><<< 43681 1727204731.91243: stderr chunk (state=3): >>><<< 43681 1727204731.91271: done transferring module to remote 43681 1727204731.91296: _low_level_execute_command(): starting 43681 1727204731.91299: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204731.829996-45633-204396093780705/ /root/.ansible/tmp/ansible-tmp-1727204731.829996-45633-204396093780705/AnsiballZ_systemd.py && sleep 0' 43681 1727204731.92062: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204731.92105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204731.92146: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204731.92176: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204731.92183: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204731.92331: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204731.94258: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204731.94262: stdout chunk (state=3): >>><<< 43681 1727204731.94495: stderr chunk (state=3): >>><<< 43681 1727204731.94499: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204731.94502: _low_level_execute_command(): starting 43681 1727204731.94505: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204731.829996-45633-204396093780705/AnsiballZ_systemd.py && sleep 0' 43681 1727204731.94928: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204731.94944: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204731.94957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204731.94973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204731.94987: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 43681 1727204731.94998: stderr chunk (state=3): >>>debug2: match not found <<< 43681 1727204731.95010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204731.95027: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 43681 1727204731.95034: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address <<< 43681 1727204731.95042: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 43681 1727204731.95112: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204731.95149: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204731.95168: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204731.95187: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204731.95299: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204732.28546: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", 
"GuessMainPID": "yes", "MainPID": "647", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:45:24 EDT", "ExecMainStartTimestampMonotonic": "28911103", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "647", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3378", "MemoryCurrent": "11874304", "MemoryAvailable": "infinity", "CPUUsageNSec": "2066359000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "in<<< 43681 1727204732.28563: stdout chunk (state=3): >>>finity", 
"LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": 
"NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target cloud-init.service shutdown.target NetworkManager-wait-online.service network.service network.target", "After": "network-pre.target basic.target dbus-broker.service cloud-init-local.service systemd-journald.socket sysinit.target dbus.socket system.slice", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "l<<< 43681 1727204732.28582: stdout chunk (state=3): >>>oaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:02:42 EDT", "StateChangeTimestampMonotonic": "1066831351", "InactiveExitTimestamp": "Tue 2024-09-24 14:45:24 EDT", "InactiveExitTimestampMonotonic": "28911342", "ActiveEnterTimestamp": "Tue 2024-09-24 14:45:25 EDT", "ActiveEnterTimestampMonotonic": "29816317", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:45:24 EDT", "ConditionTimestampMonotonic": "28901880", "AssertTimestamp": "Tue 2024-09-24 14:45:24 EDT", "AssertTimestampMonotonic": "28901883", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "b6a383b318af414f897f5e2227729b18", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 43681 1727204732.30609: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
<<< 43681 1727204732.30675: stderr chunk (state=3): >>><<< 43681 1727204732.30678: stdout chunk (state=3): >>><<< 43681 1727204732.30697: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "647", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:45:24 EDT", "ExecMainStartTimestampMonotonic": "28911103", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "647", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3378", "MemoryCurrent": "11874304", "MemoryAvailable": "infinity", "CPUUsageNSec": "2066359000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", 
"StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target cloud-init.service shutdown.target NetworkManager-wait-online.service network.service network.target", "After": "network-pre.target basic.target dbus-broker.service cloud-init-local.service systemd-journald.socket sysinit.target dbus.socket system.slice", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:02:42 EDT", "StateChangeTimestampMonotonic": "1066831351", "InactiveExitTimestamp": "Tue 2024-09-24 14:45:24 EDT", "InactiveExitTimestampMonotonic": "28911342", "ActiveEnterTimestamp": "Tue 2024-09-24 14:45:25 EDT", "ActiveEnterTimestampMonotonic": "29816317", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:45:24 EDT", "ConditionTimestampMonotonic": "28901880", "AssertTimestamp": "Tue 2024-09-24 14:45:24 EDT", "AssertTimestampMonotonic": "28901883", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "b6a383b318af414f897f5e2227729b18", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 43681 1727204732.30874: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204731.829996-45633-204396093780705/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 43681 1727204732.30893: _low_level_execute_command(): starting 43681 1727204732.30901: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204731.829996-45633-204396093780705/ > /dev/null 2>&1 && sleep 0' 43681 1727204732.31392: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204732.31429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 43681 1727204732.31434: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204732.31437: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204732.31495: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204732.31502: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204732.31504: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204732.31539: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204732.33536: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 <<< 43681 1727204732.33569: stderr chunk (state=3): >>><<< 43681 1727204732.33573: stdout chunk (state=3): >>><<< 43681 1727204732.33695: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204732.33700: handler run complete 43681 1727204732.33751: attempt loop complete, returning result 43681 1727204732.33754: _execute() done 43681 1727204732.33757: dumping result to json 43681 1727204732.33772: done dumping result, returning 43681 1727204732.33782: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12b410aa-8751-9e86-7728-00000000009d] 43681 1727204732.33788: sending task result for task 12b410aa-8751-9e86-7728-00000000009d ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 43681 1727204732.34158: no more pending results, returning what we have 43681 1727204732.34162: results queue empty 43681 1727204732.34163: checking for any_errors_fatal 43681 1727204732.34170: done checking for any_errors_fatal 43681 1727204732.34170: checking for max_fail_percentage 43681 1727204732.34172: done checking for max_fail_percentage 43681 1727204732.34173: checking to see if all hosts have failed and the running result is not ok 43681 1727204732.34174: done checking to see if all hosts have failed 43681 1727204732.34175: getting the remaining hosts for this loop 43681 1727204732.34177: done getting the remaining hosts for this loop 43681 1727204732.34182: getting the next task for host managed-node3 43681 1727204732.34187: done getting next task for host managed-node3 43681 1727204732.34193: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 43681 1727204732.34195: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204732.34207: getting variables 43681 1727204732.34208: in VariableManager get_vars() 43681 1727204732.34245: Calling all_inventory to load vars for managed-node3 43681 1727204732.34248: Calling groups_inventory to load vars for managed-node3 43681 1727204732.34251: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204732.34261: Calling all_plugins_play to load vars for managed-node3 43681 1727204732.34264: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204732.34267: Calling groups_plugins_play to load vars for managed-node3 43681 1727204732.35316: done sending task result for task 12b410aa-8751-9e86-7728-00000000009d 43681 1727204732.35321: WORKER PROCESS EXITING 43681 1727204732.35692: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204732.38656: done with get_vars() 43681 1727204732.38704: done getting variables 43681 1727204732.38779: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 15:05:32 -0400 (0:00:00.680) 0:00:40.054 ***** 43681 1727204732.38820: entering _queue_task() for managed-node3/service 43681 1727204732.39202: worker is 1 (out of 1 available) 43681 1727204732.39218: exiting _queue_task() for managed-node3/service 43681 1727204732.39231: done queuing things up, now waiting for results queue to drain 43681 1727204732.39233: waiting for pending results... 
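For readability, the module_args echoed in the result above (name: NetworkManager, state: started, enabled: true, scope: system, with the task result censored because no_log was set) correspond to a service-management task roughly like the one sketched below. This is a hedged reconstruction from the log, not the literal source of roles/network/tasks/main.yml; the hard-coded service name and the no_log placement are assumptions.

    # Sketch inferred from the ansible.legacy.systemd module_args printed above.
    # The 'service' action plugin dispatched to the systemd module, as the log shows.
    - name: Enable and start NetworkManager
      ansible.builtin.service:
        name: NetworkManager   # the role likely templates this from a variable (assumption)
        state: started
        enabled: true
      no_log: true             # matches the "censored" result printed above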
43681 1727204732.39621: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 43681 1727204732.39681: in run() - task 12b410aa-8751-9e86-7728-00000000009e 43681 1727204732.39711: variable 'ansible_search_path' from source: unknown 43681 1727204732.39723: variable 'ansible_search_path' from source: unknown 43681 1727204732.39768: calling self._execute() 43681 1727204732.39884: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204732.39936: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204732.39940: variable 'omit' from source: magic vars 43681 1727204732.40497: variable 'ansible_distribution_major_version' from source: facts 43681 1727204732.40500: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204732.40641: variable 'network_provider' from source: set_fact 43681 1727204732.40655: Evaluated conditional (network_provider == "nm"): True 43681 1727204732.40779: variable '__network_wpa_supplicant_required' from source: role '' defaults 43681 1727204732.40899: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 43681 1727204732.41283: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 43681 1727204732.44088: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 43681 1727204732.44168: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 43681 1727204732.44228: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 43681 1727204732.44278: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 43681 1727204732.44319: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 43681 1727204732.44596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204732.44601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204732.44671: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204732.45095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204732.45099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204732.45102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204732.45295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 43681 1727204732.45299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204732.45302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204732.45305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204732.45307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204732.45357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204732.45453: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204732.45514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204732.45535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204732.45758: variable 'network_connections' from source: play vars 43681 1727204732.45932: variable 'profile' from source: play vars 43681 1727204732.45936: variable 'profile' from source: play vars 43681 1727204732.45939: variable 'interface' from source: set_fact 43681 1727204732.45977: variable 'interface' from source: set_fact 43681 1727204732.46084: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 43681 1727204732.46320: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 43681 1727204732.46371: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 43681 1727204732.46423: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 43681 1727204732.46462: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 43681 1727204732.46531: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 43681 1727204732.46559: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 43681 1727204732.46594: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204732.46641: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 43681 1727204732.46767: variable '__network_wireless_connections_defined' from source: role '' defaults 43681 1727204732.47210: variable 'network_connections' from source: play vars 43681 1727204732.47214: variable 'profile' from source: play vars 43681 1727204732.47216: variable 'profile' from source: play vars 43681 1727204732.47219: variable 'interface' from source: set_fact 43681 1727204732.47297: variable 'interface' from source: set_fact 43681 1727204732.47336: Evaluated conditional (__network_wpa_supplicant_required): False 43681 1727204732.47339: when evaluation is False, skipping this task 43681 1727204732.47342: _execute() done 43681 1727204732.47353: dumping result to json 43681 1727204732.47356: done dumping result, returning 43681 1727204732.47359: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12b410aa-8751-9e86-7728-00000000009e] 43681 1727204732.47379: sending task result for task 12b410aa-8751-9e86-7728-00000000009e skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 43681 1727204732.47543: no more pending results, returning what we have 43681 1727204732.47548: results queue empty 43681 1727204732.47550: checking for any_errors_fatal 43681 1727204732.47706: done checking for any_errors_fatal 43681 1727204732.47708: checking for max_fail_percentage 43681 1727204732.47711: done checking for max_fail_percentage 43681 1727204732.47712: checking to see if all hosts have failed and the running result is not ok 43681 1727204732.47713: done checking to see if all hosts have failed 43681 1727204732.47714: getting the remaining hosts for this loop 43681 1727204732.47716: done getting the remaining hosts for this loop 43681 1727204732.47725: getting the next task for host managed-node3 43681 1727204732.47732: done getting next task for host managed-node3 43681 1727204732.47737: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 43681 1727204732.47739: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204732.47755: getting variables 43681 1727204732.47757: in VariableManager get_vars() 43681 1727204732.47928: Calling all_inventory to load vars for managed-node3 43681 1727204732.47932: Calling groups_inventory to load vars for managed-node3 43681 1727204732.47934: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204732.47947: Calling all_plugins_play to load vars for managed-node3 43681 1727204732.47950: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204732.47954: Calling groups_plugins_play to load vars for managed-node3 43681 1727204732.48524: done sending task result for task 12b410aa-8751-9e86-7728-00000000009e 43681 1727204732.48529: WORKER PROCESS EXITING 43681 1727204732.50211: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204732.52030: done with get_vars() 43681 1727204732.52072: done getting variables 43681 1727204732.52149: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 15:05:32 -0400 (0:00:00.133) 0:00:40.188 ***** 43681 1727204732.52185: entering _queue_task() for managed-node3/service 43681 1727204732.52563: worker is 1 (out of 1 available) 43681 1727204732.52576: exiting _queue_task() for managed-node3/service 43681 1727204732.52591: done queuing things up, now waiting for results queue to drain 43681 1727204732.52593: waiting for pending results... 
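The wpa_supplicant task above was skipped because __network_wpa_supplicant_required evaluated to False (no IEEE 802.1X or wireless connections are defined in network_connections, as the variable loads in the log indicate). A minimal sketch of such a conditional task follows, inferred from the conditionals the log evaluated; the exact when-clause spellings in the real role are assumptions.

    # Sketch of the skipped task at roles/network/tasks/main.yml:133 (inferred, not verbatim).
    - name: Enable and start wpa_supplicant
      ansible.builtin.service:
        name: wpa_supplicant
        state: started
        enabled: true
      when:
        - network_provider == "nm"
        - __network_wpa_supplicant_required | bool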
43681 1727204732.52897: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service 43681 1727204732.52979: in run() - task 12b410aa-8751-9e86-7728-00000000009f 43681 1727204732.52993: variable 'ansible_search_path' from source: unknown 43681 1727204732.52998: variable 'ansible_search_path' from source: unknown 43681 1727204732.53035: calling self._execute() 43681 1727204732.53124: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204732.53128: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204732.53142: variable 'omit' from source: magic vars 43681 1727204732.53479: variable 'ansible_distribution_major_version' from source: facts 43681 1727204732.53498: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204732.53597: variable 'network_provider' from source: set_fact 43681 1727204732.53604: Evaluated conditional (network_provider == "initscripts"): False 43681 1727204732.53607: when evaluation is False, skipping this task 43681 1727204732.53610: _execute() done 43681 1727204732.53615: dumping result to json 43681 1727204732.53620: done dumping result, returning 43681 1727204732.53630: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service [12b410aa-8751-9e86-7728-00000000009f] 43681 1727204732.53636: sending task result for task 12b410aa-8751-9e86-7728-00000000009f 43681 1727204732.53731: done sending task result for task 12b410aa-8751-9e86-7728-00000000009f 43681 1727204732.53735: WORKER PROCESS EXITING skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 43681 1727204732.53787: no more pending results, returning what we have 43681 1727204732.53794: results queue empty 43681 1727204732.53795: checking for any_errors_fatal 43681 1727204732.53804: done checking for any_errors_fatal 43681 1727204732.53805: checking for max_fail_percentage 43681 1727204732.53807: done checking for max_fail_percentage 43681 1727204732.53808: checking to see if all hosts have failed and the running result is not ok 43681 1727204732.53809: done checking to see if all hosts have failed 43681 1727204732.53810: getting the remaining hosts for this loop 43681 1727204732.53812: done getting the remaining hosts for this loop 43681 1727204732.53816: getting the next task for host managed-node3 43681 1727204732.53824: done getting next task for host managed-node3 43681 1727204732.53829: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 43681 1727204732.53832: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204732.53856: getting variables 43681 1727204732.53858: in VariableManager get_vars() 43681 1727204732.53900: Calling all_inventory to load vars for managed-node3 43681 1727204732.53903: Calling groups_inventory to load vars for managed-node3 43681 1727204732.53905: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204732.53915: Calling all_plugins_play to load vars for managed-node3 43681 1727204732.53918: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204732.53924: Calling groups_plugins_play to load vars for managed-node3 43681 1727204732.55662: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204732.57668: done with get_vars() 43681 1727204732.57697: done getting variables 43681 1727204732.57752: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 15:05:32 -0400 (0:00:00.055) 0:00:40.244 ***** 43681 1727204732.57780: entering _queue_task() for managed-node3/copy 43681 1727204732.58052: worker is 1 (out of 1 available) 43681 1727204732.58068: exiting _queue_task() for managed-node3/copy 43681 1727204732.58080: done queuing things up, now waiting for results queue to drain 43681 1727204732.58082: waiting for pending results... 43681 1727204732.58408: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 43681 1727204732.58595: in run() - task 12b410aa-8751-9e86-7728-0000000000a0 43681 1727204732.58599: variable 'ansible_search_path' from source: unknown 43681 1727204732.58603: variable 'ansible_search_path' from source: unknown 43681 1727204732.58606: calling self._execute() 43681 1727204732.58631: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204732.58644: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204732.58666: variable 'omit' from source: magic vars 43681 1727204732.59118: variable 'ansible_distribution_major_version' from source: facts 43681 1727204732.59136: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204732.59291: variable 'network_provider' from source: set_fact 43681 1727204732.59304: Evaluated conditional (network_provider == "initscripts"): False 43681 1727204732.59312: when evaluation is False, skipping this task 43681 1727204732.59325: _execute() done 43681 1727204732.59337: dumping result to json 43681 1727204732.59340: done dumping result, returning 43681 1727204732.59350: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12b410aa-8751-9e86-7728-0000000000a0] 43681 1727204732.59355: sending task result for task 12b410aa-8751-9e86-7728-0000000000a0 43681 1727204732.59465: done sending task result for task 12b410aa-8751-9e86-7728-0000000000a0 43681 1727204732.59468: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_provider == 
\"initscripts\"", "skip_reason": "Conditional result was False" } 43681 1727204732.59537: no more pending results, returning what we have 43681 1727204732.59542: results queue empty 43681 1727204732.59543: checking for any_errors_fatal 43681 1727204732.59551: done checking for any_errors_fatal 43681 1727204732.59552: checking for max_fail_percentage 43681 1727204732.59555: done checking for max_fail_percentage 43681 1727204732.59556: checking to see if all hosts have failed and the running result is not ok 43681 1727204732.59557: done checking to see if all hosts have failed 43681 1727204732.59558: getting the remaining hosts for this loop 43681 1727204732.59559: done getting the remaining hosts for this loop 43681 1727204732.59563: getting the next task for host managed-node3 43681 1727204732.59569: done getting next task for host managed-node3 43681 1727204732.59574: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 43681 1727204732.59576: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 43681 1727204732.59593: getting variables 43681 1727204732.59594: in VariableManager get_vars() 43681 1727204732.59632: Calling all_inventory to load vars for managed-node3 43681 1727204732.59635: Calling groups_inventory to load vars for managed-node3 43681 1727204732.59637: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204732.59647: Calling all_plugins_play to load vars for managed-node3 43681 1727204732.59651: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204732.59654: Calling groups_plugins_play to load vars for managed-node3 43681 1727204732.61556: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204732.63466: done with get_vars() 43681 1727204732.63506: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 15:05:32 -0400 (0:00:00.058) 0:00:40.302 ***** 43681 1727204732.63609: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 43681 1727204732.63986: worker is 1 (out of 1 available) 43681 1727204732.64004: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 43681 1727204732.64018: done queuing things up, now waiting for results queue to drain 43681 1727204732.64020: waiting for pending results... 
43681 1727204732.64256: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 43681 1727204732.64343: in run() - task 12b410aa-8751-9e86-7728-0000000000a1 43681 1727204732.64357: variable 'ansible_search_path' from source: unknown 43681 1727204732.64361: variable 'ansible_search_path' from source: unknown 43681 1727204732.64397: calling self._execute() 43681 1727204732.64486: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204732.64496: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204732.64507: variable 'omit' from source: magic vars 43681 1727204732.64845: variable 'ansible_distribution_major_version' from source: facts 43681 1727204732.64858: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204732.64865: variable 'omit' from source: magic vars 43681 1727204732.64899: variable 'omit' from source: magic vars 43681 1727204732.65045: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 43681 1727204732.66798: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 43681 1727204732.66854: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 43681 1727204732.66887: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 43681 1727204732.66921: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 43681 1727204732.66948: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 43681 1727204732.67021: variable 'network_provider' from source: set_fact 43681 1727204732.67138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 43681 1727204732.67175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 43681 1727204732.67199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 43681 1727204732.67236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 43681 1727204732.67252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 43681 1727204732.67314: variable 'omit' from source: magic vars 43681 1727204732.67413: variable 'omit' from source: magic vars 43681 1727204732.67502: variable 'network_connections' from source: play vars 43681 1727204732.67513: variable 'profile' from source: play vars 43681 1727204732.67574: variable 'profile' from source: play vars 43681 1727204732.67579: variable 'interface' from source: set_fact 43681 1727204732.67633: variable 'interface' from source: set_fact 43681 1727204732.67755: variable 'omit' from source: magic vars 43681 1727204732.67763: 
variable '__lsr_ansible_managed' from source: task vars 43681 1727204732.67816: variable '__lsr_ansible_managed' from source: task vars 43681 1727204732.68058: Loaded config def from plugin (lookup/template) 43681 1727204732.68064: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 43681 1727204732.68092: File lookup term: get_ansible_managed.j2 43681 1727204732.68096: variable 'ansible_search_path' from source: unknown 43681 1727204732.68106: evaluation_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 43681 1727204732.68118: search_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 43681 1727204732.68137: variable 'ansible_search_path' from source: unknown 43681 1727204732.78300: variable 'ansible_managed' from source: unknown 43681 1727204732.78441: variable 'omit' from source: magic vars 43681 1727204732.78463: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204732.78483: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204732.78500: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204732.78516: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204732.78527: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204732.78544: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204732.78548: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204732.78553: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204732.78629: Set connection var ansible_shell_type to sh 43681 1727204732.78636: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204732.78644: Set connection var ansible_timeout to 10 43681 1727204732.78652: Set connection var ansible_pipelining to False 43681 1727204732.78658: Set connection var ansible_connection to ssh 43681 1727204732.78664: Set connection var ansible_shell_executable to /bin/sh 43681 1727204732.78683: variable 'ansible_shell_executable' from source: unknown 43681 1727204732.78686: variable 'ansible_connection' from source: unknown 43681 1727204732.78692: 
variable 'ansible_module_compression' from source: unknown 43681 1727204732.78695: variable 'ansible_shell_type' from source: unknown 43681 1727204732.78699: variable 'ansible_shell_executable' from source: unknown 43681 1727204732.78703: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204732.78709: variable 'ansible_pipelining' from source: unknown 43681 1727204732.78711: variable 'ansible_timeout' from source: unknown 43681 1727204732.78717: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204732.78822: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 43681 1727204732.78835: variable 'omit' from source: magic vars 43681 1727204732.78846: starting attempt loop 43681 1727204732.78850: running the handler 43681 1727204732.78856: _low_level_execute_command(): starting 43681 1727204732.78862: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 43681 1727204732.79355: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204732.79398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 43681 1727204732.79401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204732.79403: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204732.79406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204732.79458: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204732.79461: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204732.79465: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204732.79513: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204732.81252: stdout chunk (state=3): >>>/root <<< 43681 1727204732.81361: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204732.81413: stderr chunk (state=3): >>><<< 43681 1727204732.81416: stdout chunk (state=3): >>><<< 43681 1727204732.81442: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204732.81451: _low_level_execute_command(): starting 43681 1727204732.81458: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204732.8144078-45671-158995835750778 `" && echo ansible-tmp-1727204732.8144078-45671-158995835750778="` echo /root/.ansible/tmp/ansible-tmp-1727204732.8144078-45671-158995835750778 `" ) && sleep 0' 43681 1727204732.81929: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204732.81932: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 43681 1727204732.81935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 43681 1727204732.81938: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204732.81940: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204732.81997: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204732.82013: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204732.82037: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204732.84011: stdout chunk (state=3): >>>ansible-tmp-1727204732.8144078-45671-158995835750778=/root/.ansible/tmp/ansible-tmp-1727204732.8144078-45671-158995835750778 <<< 43681 1727204732.84121: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204732.84165: stderr chunk (state=3): >>><<< 43681 1727204732.84169: stdout chunk (state=3): >>><<< 43681 1727204732.84187: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204732.8144078-45671-158995835750778=/root/.ansible/tmp/ansible-tmp-1727204732.8144078-45671-158995835750778 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204732.84228: variable 'ansible_module_compression' from source: unknown 43681 1727204732.84261: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-436817kzep24q/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 43681 1727204732.84302: variable 'ansible_facts' from source: unknown 43681 1727204732.84394: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204732.8144078-45671-158995835750778/AnsiballZ_network_connections.py 43681 1727204732.84503: Sending initial data 43681 1727204732.84507: Sent initial data (168 bytes) 43681 1727204732.84950: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204732.84954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204732.84961: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204732.84964: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204732.85013: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204732.85017: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204732.85058: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204732.86663: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: 
Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 43681 1727204732.86731: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 43681 1727204732.86769: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-436817kzep24q/tmp4ikfrtq8 /root/.ansible/tmp/ansible-tmp-1727204732.8144078-45671-158995835750778/AnsiballZ_network_connections.py <<< 43681 1727204732.86772: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204732.8144078-45671-158995835750778/AnsiballZ_network_connections.py" <<< 43681 1727204732.86804: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-436817kzep24q/tmp4ikfrtq8" to remote "/root/.ansible/tmp/ansible-tmp-1727204732.8144078-45671-158995835750778/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204732.8144078-45671-158995835750778/AnsiballZ_network_connections.py" <<< 43681 1727204732.88663: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204732.88667: stderr chunk (state=3): >>><<< 43681 1727204732.88669: stdout chunk (state=3): >>><<< 43681 1727204732.88672: done transferring module to remote 43681 1727204732.88674: _low_level_execute_command(): starting 43681 1727204732.88677: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204732.8144078-45671-158995835750778/ /root/.ansible/tmp/ansible-tmp-1727204732.8144078-45671-158995835750778/AnsiballZ_network_connections.py && sleep 0' 43681 1727204732.89506: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204732.89588: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204732.89615: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204732.89630: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204732.89714: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204732.91620: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204732.91623: stdout chunk (state=3): >>><<< 43681 1727204732.91626: stderr chunk (state=3): >>><<< 43681 1727204732.91644: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204732.91733: _low_level_execute_command(): starting 43681 1727204732.91736: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204732.8144078-45671-158995835750778/AnsiballZ_network_connections.py && sleep 0' 43681 1727204732.92233: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204732.92247: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204732.92262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204732.92281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204732.92387: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204732.92422: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204732.92495: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204733.23109: stdout chunk (state=3): >>>Traceback (most recent call last): <<< 43681 1727204733.23133: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ic81bgdc/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ic81bgdc/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail 
ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on ethtest0/4ec82e11-94f9-4a67-a45f-0542deb3f3a9: error=unknown <<< 43681 1727204733.23307: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 43681 1727204733.25406: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 43681 1727204733.25430: stderr chunk (state=3): >>><<< 43681 1727204733.25441: stdout chunk (state=3): >>><<< 43681 1727204733.25596: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ic81bgdc/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ic81bgdc/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on ethtest0/4ec82e11-94f9-4a67-a45f-0542deb3f3a9: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 43681 1727204733.25600: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204732.8144078-45671-158995835750778/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 43681 1727204733.25603: _low_level_execute_command(): starting 43681 1727204733.25605: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204732.8144078-45671-158995835750778/ > /dev/null 2>&1 && sleep 0' 43681 1727204733.26233: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204733.26248: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204733.26266: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204733.26394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204733.26413: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204733.26477: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204733.28404: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204733.28507: stderr chunk (state=3): >>><<< 43681 1727204733.28698: stdout chunk (state=3): >>><<< 43681 1727204733.28701: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204733.28704: handler run complete 43681 1727204733.28707: attempt loop complete, returning result 43681 1727204733.28709: _execute() done 43681 1727204733.28711: dumping result to json 43681 1727204733.28714: done dumping result, returning 43681 1727204733.28716: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12b410aa-8751-9e86-7728-0000000000a1] 43681 1727204733.28718: sending task result for task 12b410aa-8751-9e86-7728-0000000000a1 43681 1727204733.28801: done sending task result for task 12b410aa-8751-9e86-7728-0000000000a1 43681 1727204733.28810: WORKER PROCESS EXITING changed: [managed-node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 43681 1727204733.28945: no more pending results, returning what we have 43681 1727204733.28950: results queue empty 43681 1727204733.28951: checking for any_errors_fatal 43681 1727204733.28958: done checking for any_errors_fatal 43681 1727204733.28959: checking for max_fail_percentage 43681 1727204733.28961: done checking for max_fail_percentage 43681 1727204733.28962: checking to see if all hosts have failed and the running result is not ok 43681 1727204733.28963: done checking to see if all hosts have failed 43681 1727204733.28964: getting the remaining hosts for this loop 43681 1727204733.28966: done getting the remaining hosts for this loop 43681 1727204733.28971: getting the next task for host managed-node3 43681 1727204733.28977: done getting next task for host managed-node3 43681 1727204733.28981: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 43681 1727204733.28984: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204733.29040: getting variables 43681 1727204733.29042: in VariableManager get_vars() 43681 1727204733.29085: Calling all_inventory to load vars for managed-node3 43681 1727204733.29090: Calling groups_inventory to load vars for managed-node3 43681 1727204733.29094: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204733.29225: Calling all_plugins_play to load vars for managed-node3 43681 1727204733.29230: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204733.29235: Calling groups_plugins_play to load vars for managed-node3 43681 1727204733.32022: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204733.35273: done with get_vars() 43681 1727204733.35319: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:05:33 -0400 (0:00:00.718) 0:00:41.020 ***** 43681 1727204733.35432: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_state 43681 1727204733.35768: worker is 1 (out of 1 available) 43681 1727204733.35784: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_state 43681 1727204733.35799: done queuing things up, now waiting for results queue to drain 43681 1727204733.35801: waiting for pending results... 43681 1727204733.36013: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state 43681 1727204733.36103: in run() - task 12b410aa-8751-9e86-7728-0000000000a2 43681 1727204733.36116: variable 'ansible_search_path' from source: unknown 43681 1727204733.36120: variable 'ansible_search_path' from source: unknown 43681 1727204733.36157: calling self._execute() 43681 1727204733.36248: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204733.36252: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204733.36264: variable 'omit' from source: magic vars 43681 1727204733.36600: variable 'ansible_distribution_major_version' from source: facts 43681 1727204733.36610: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204733.36716: variable 'network_state' from source: role '' defaults 43681 1727204733.36727: Evaluated conditional (network_state != {}): False 43681 1727204733.36731: when evaluation is False, skipping this task 43681 1727204733.36734: _execute() done 43681 1727204733.36737: dumping result to json 43681 1727204733.36742: done dumping result, returning 43681 1727204733.36750: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state [12b410aa-8751-9e86-7728-0000000000a2] 43681 1727204733.36756: sending task result for task 12b410aa-8751-9e86-7728-0000000000a2 43681 1727204733.36854: done sending task result for task 12b410aa-8751-9e86-7728-0000000000a2 43681 1727204733.36857: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 43681 1727204733.36915: no more pending results, returning what we have 43681 1727204733.36920: results queue empty 43681 1727204733.36924: checking for any_errors_fatal 43681 1727204733.36936: done checking for any_errors_fatal 43681 1727204733.36936: checking for max_fail_percentage 43681 
1727204733.36938: done checking for max_fail_percentage 43681 1727204733.36939: checking to see if all hosts have failed and the running result is not ok 43681 1727204733.36940: done checking to see if all hosts have failed 43681 1727204733.36941: getting the remaining hosts for this loop 43681 1727204733.36944: done getting the remaining hosts for this loop 43681 1727204733.36948: getting the next task for host managed-node3 43681 1727204733.36954: done getting next task for host managed-node3 43681 1727204733.36958: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 43681 1727204733.36961: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 43681 1727204733.36976: getting variables 43681 1727204733.36978: in VariableManager get_vars() 43681 1727204733.37016: Calling all_inventory to load vars for managed-node3 43681 1727204733.37019: Calling groups_inventory to load vars for managed-node3 43681 1727204733.37025: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204733.37034: Calling all_plugins_play to load vars for managed-node3 43681 1727204733.37037: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204733.37041: Calling groups_plugins_play to load vars for managed-node3 43681 1727204733.39288: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204733.41098: done with get_vars() 43681 1727204733.41123: done getting variables 43681 1727204733.41176: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:05:33 -0400 (0:00:00.057) 0:00:41.078 ***** 43681 1727204733.41204: entering _queue_task() for managed-node3/debug 43681 1727204733.41455: worker is 1 (out of 1 available) 43681 1727204733.41470: exiting _queue_task() for managed-node3/debug 43681 1727204733.41482: done queuing things up, now waiting for results queue to drain 43681 1727204733.41484: waiting for pending results... 
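
The result recorded above is the only change in this block: the fedora.linux_system_roles.network role removed the ethtest0 profile through the nm provider (connections: [{"name": "ethtest0", "persistent_state": "absent"}]), and the follow-up "Configure networking state" task was skipped because network_state was empty. For orientation, a play that would drive a run like this might look roughly like the sketch below; the play name is an assumption, while the host, role name, and connection settings are taken from the module arguments shown in the log.

- name: Remove the ethtest0 connection profile   # play name is an assumption
  hosts: managed-node3
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_connections:
          - name: ethtest0
            persistent_state: absent
        # network_state is left unset, so the role's "Configure networking
        # state" task is skipped (network_state != {} evaluates to False).
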
43681 1727204733.42007: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 43681 1727204733.42012: in run() - task 12b410aa-8751-9e86-7728-0000000000a3 43681 1727204733.42015: variable 'ansible_search_path' from source: unknown 43681 1727204733.42018: variable 'ansible_search_path' from source: unknown 43681 1727204733.42023: calling self._execute() 43681 1727204733.42055: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204733.42070: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204733.42086: variable 'omit' from source: magic vars 43681 1727204733.42592: variable 'ansible_distribution_major_version' from source: facts 43681 1727204733.42614: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204733.42633: variable 'omit' from source: magic vars 43681 1727204733.42694: variable 'omit' from source: magic vars 43681 1727204733.42727: variable 'omit' from source: magic vars 43681 1727204733.42763: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204733.42802: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204733.42816: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204733.42833: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204733.42845: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204733.42879: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204733.42882: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204733.42885: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204733.42972: Set connection var ansible_shell_type to sh 43681 1727204733.42979: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204733.42985: Set connection var ansible_timeout to 10 43681 1727204733.42995: Set connection var ansible_pipelining to False 43681 1727204733.43002: Set connection var ansible_connection to ssh 43681 1727204733.43010: Set connection var ansible_shell_executable to /bin/sh 43681 1727204733.43033: variable 'ansible_shell_executable' from source: unknown 43681 1727204733.43036: variable 'ansible_connection' from source: unknown 43681 1727204733.43039: variable 'ansible_module_compression' from source: unknown 43681 1727204733.43041: variable 'ansible_shell_type' from source: unknown 43681 1727204733.43046: variable 'ansible_shell_executable' from source: unknown 43681 1727204733.43049: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204733.43054: variable 'ansible_pipelining' from source: unknown 43681 1727204733.43057: variable 'ansible_timeout' from source: unknown 43681 1727204733.43063: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204733.43187: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 43681 
1727204733.43199: variable 'omit' from source: magic vars 43681 1727204733.43206: starting attempt loop 43681 1727204733.43209: running the handler 43681 1727204733.43315: variable '__network_connections_result' from source: set_fact 43681 1727204733.43365: handler run complete 43681 1727204733.43382: attempt loop complete, returning result 43681 1727204733.43385: _execute() done 43681 1727204733.43388: dumping result to json 43681 1727204733.43395: done dumping result, returning 43681 1727204733.43404: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12b410aa-8751-9e86-7728-0000000000a3] 43681 1727204733.43410: sending task result for task 12b410aa-8751-9e86-7728-0000000000a3 ok: [managed-node3] => { "__network_connections_result.stderr_lines": [ "" ] } 43681 1727204733.43572: no more pending results, returning what we have 43681 1727204733.43577: results queue empty 43681 1727204733.43578: checking for any_errors_fatal 43681 1727204733.43583: done checking for any_errors_fatal 43681 1727204733.43584: checking for max_fail_percentage 43681 1727204733.43585: done checking for max_fail_percentage 43681 1727204733.43587: checking to see if all hosts have failed and the running result is not ok 43681 1727204733.43588: done checking to see if all hosts have failed 43681 1727204733.43590: getting the remaining hosts for this loop 43681 1727204733.43592: done getting the remaining hosts for this loop 43681 1727204733.43596: getting the next task for host managed-node3 43681 1727204733.43604: done getting next task for host managed-node3 43681 1727204733.43608: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 43681 1727204733.43610: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204733.43620: getting variables 43681 1727204733.43624: in VariableManager get_vars() 43681 1727204733.43659: Calling all_inventory to load vars for managed-node3 43681 1727204733.43661: Calling groups_inventory to load vars for managed-node3 43681 1727204733.43664: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204733.43674: Calling all_plugins_play to load vars for managed-node3 43681 1727204733.43677: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204733.43680: Calling groups_plugins_play to load vars for managed-node3 43681 1727204733.44206: done sending task result for task 12b410aa-8751-9e86-7728-0000000000a3 43681 1727204733.44210: WORKER PROCESS EXITING 43681 1727204733.48701: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204733.51841: done with get_vars() 43681 1727204733.51902: done getting variables 43681 1727204733.51970: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 15:05:33 -0400 (0:00:00.108) 0:00:41.186 ***** 43681 1727204733.52016: entering _queue_task() for managed-node3/debug 43681 1727204733.52455: worker is 1 (out of 1 available) 43681 1727204733.52471: exiting _queue_task() for managed-node3/debug 43681 1727204733.52484: done queuing things up, now waiting for results queue to drain 43681 1727204733.52486: waiting for pending results... 
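
The "Show stderr messages" result just above and the "Show debug messages" task queued here simply echo the registered __network_connections_result fact; the task paths point at roles/network/tasks/main.yml:177 and :181. Reconstructed from the task names and the variables printed in this log (not copied from the role source), they amount to ordinary debug tasks along these lines:

- name: Show stderr messages for the network_connections
  debug:
    var: __network_connections_result.stderr_lines

- name: Show debug messages for the network_connections
  debug:
    var: __network_connections_result
  # In this run the fact carries changed: true and an empty stderr,
  # matching the "ok: [managed-node3]" output that follows.
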
43681 1727204733.52812: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 43681 1727204733.52874: in run() - task 12b410aa-8751-9e86-7728-0000000000a4 43681 1727204733.52903: variable 'ansible_search_path' from source: unknown 43681 1727204733.52915: variable 'ansible_search_path' from source: unknown 43681 1727204733.52964: calling self._execute() 43681 1727204733.53088: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204733.53111: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204733.53133: variable 'omit' from source: magic vars 43681 1727204733.53620: variable 'ansible_distribution_major_version' from source: facts 43681 1727204733.53632: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204733.53647: variable 'omit' from source: magic vars 43681 1727204733.53679: variable 'omit' from source: magic vars 43681 1727204733.53714: variable 'omit' from source: magic vars 43681 1727204733.53751: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204733.53784: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204733.53806: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204733.53824: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204733.53833: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204733.53865: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204733.53869: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204733.53872: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204733.53962: Set connection var ansible_shell_type to sh 43681 1727204733.53966: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204733.53976: Set connection var ansible_timeout to 10 43681 1727204733.53989: Set connection var ansible_pipelining to False 43681 1727204733.53994: Set connection var ansible_connection to ssh 43681 1727204733.54001: Set connection var ansible_shell_executable to /bin/sh 43681 1727204733.54023: variable 'ansible_shell_executable' from source: unknown 43681 1727204733.54027: variable 'ansible_connection' from source: unknown 43681 1727204733.54030: variable 'ansible_module_compression' from source: unknown 43681 1727204733.54033: variable 'ansible_shell_type' from source: unknown 43681 1727204733.54036: variable 'ansible_shell_executable' from source: unknown 43681 1727204733.54038: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204733.54043: variable 'ansible_pipelining' from source: unknown 43681 1727204733.54045: variable 'ansible_timeout' from source: unknown 43681 1727204733.54051: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204733.54171: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 43681 
1727204733.54185: variable 'omit' from source: magic vars 43681 1727204733.54191: starting attempt loop 43681 1727204733.54199: running the handler 43681 1727204733.54242: variable '__network_connections_result' from source: set_fact 43681 1727204733.54311: variable '__network_connections_result' from source: set_fact 43681 1727204733.54402: handler run complete 43681 1727204733.54430: attempt loop complete, returning result 43681 1727204733.54433: _execute() done 43681 1727204733.54436: dumping result to json 43681 1727204733.54439: done dumping result, returning 43681 1727204733.54449: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12b410aa-8751-9e86-7728-0000000000a4] 43681 1727204733.54455: sending task result for task 12b410aa-8751-9e86-7728-0000000000a4 43681 1727204733.54554: done sending task result for task 12b410aa-8751-9e86-7728-0000000000a4 43681 1727204733.54557: WORKER PROCESS EXITING ok: [managed-node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 43681 1727204733.54651: no more pending results, returning what we have 43681 1727204733.54656: results queue empty 43681 1727204733.54657: checking for any_errors_fatal 43681 1727204733.54667: done checking for any_errors_fatal 43681 1727204733.54668: checking for max_fail_percentage 43681 1727204733.54670: done checking for max_fail_percentage 43681 1727204733.54671: checking to see if all hosts have failed and the running result is not ok 43681 1727204733.54672: done checking to see if all hosts have failed 43681 1727204733.54673: getting the remaining hosts for this loop 43681 1727204733.54675: done getting the remaining hosts for this loop 43681 1727204733.54679: getting the next task for host managed-node3 43681 1727204733.54686: done getting next task for host managed-node3 43681 1727204733.54698: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 43681 1727204733.54700: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204733.54711: getting variables 43681 1727204733.54713: in VariableManager get_vars() 43681 1727204733.54751: Calling all_inventory to load vars for managed-node3 43681 1727204733.54755: Calling groups_inventory to load vars for managed-node3 43681 1727204733.54758: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204733.54768: Calling all_plugins_play to load vars for managed-node3 43681 1727204733.54771: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204733.54774: Calling groups_plugins_play to load vars for managed-node3 43681 1727204733.56043: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204733.57672: done with get_vars() 43681 1727204733.57698: done getting variables 43681 1727204733.57753: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 15:05:33 -0400 (0:00:00.057) 0:00:41.244 ***** 43681 1727204733.57783: entering _queue_task() for managed-node3/debug 43681 1727204733.58038: worker is 1 (out of 1 available) 43681 1727204733.58053: exiting _queue_task() for managed-node3/debug 43681 1727204733.58067: done queuing things up, now waiting for results queue to drain 43681 1727204733.58069: waiting for pending results... 43681 1727204733.58261: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 43681 1727204733.58356: in run() - task 12b410aa-8751-9e86-7728-0000000000a5 43681 1727204733.58369: variable 'ansible_search_path' from source: unknown 43681 1727204733.58373: variable 'ansible_search_path' from source: unknown 43681 1727204733.58409: calling self._execute() 43681 1727204733.58500: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204733.58507: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204733.58524: variable 'omit' from source: magic vars 43681 1727204733.58843: variable 'ansible_distribution_major_version' from source: facts 43681 1727204733.58854: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204733.58961: variable 'network_state' from source: role '' defaults 43681 1727204733.58972: Evaluated conditional (network_state != {}): False 43681 1727204733.58975: when evaluation is False, skipping this task 43681 1727204733.58978: _execute() done 43681 1727204733.58981: dumping result to json 43681 1727204733.58984: done dumping result, returning 43681 1727204733.58993: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12b410aa-8751-9e86-7728-0000000000a5] 43681 1727204733.59000: sending task result for task 12b410aa-8751-9e86-7728-0000000000a5 skipping: [managed-node3] => { "false_condition": "network_state != {}" } 43681 1727204733.59158: no more pending results, returning what we have 43681 1727204733.59162: results queue empty 43681 1727204733.59163: checking for any_errors_fatal 43681 1727204733.59173: done checking 
for any_errors_fatal 43681 1727204733.59174: checking for max_fail_percentage 43681 1727204733.59175: done checking for max_fail_percentage 43681 1727204733.59176: checking to see if all hosts have failed and the running result is not ok 43681 1727204733.59177: done checking to see if all hosts have failed 43681 1727204733.59178: getting the remaining hosts for this loop 43681 1727204733.59180: done getting the remaining hosts for this loop 43681 1727204733.59185: getting the next task for host managed-node3 43681 1727204733.59191: done getting next task for host managed-node3 43681 1727204733.59196: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 43681 1727204733.59199: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 43681 1727204733.59212: getting variables 43681 1727204733.59214: in VariableManager get_vars() 43681 1727204733.59250: Calling all_inventory to load vars for managed-node3 43681 1727204733.59253: Calling groups_inventory to load vars for managed-node3 43681 1727204733.59255: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204733.59265: Calling all_plugins_play to load vars for managed-node3 43681 1727204733.59268: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204733.59271: Calling groups_plugins_play to load vars for managed-node3 43681 1727204733.60319: done sending task result for task 12b410aa-8751-9e86-7728-0000000000a5 43681 1727204733.60324: WORKER PROCESS EXITING 43681 1727204733.60665: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204733.62280: done with get_vars() 43681 1727204733.62305: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 15:05:33 -0400 (0:00:00.046) 0:00:41.290 ***** 43681 1727204733.62383: entering _queue_task() for managed-node3/ping 43681 1727204733.62626: worker is 1 (out of 1 available) 43681 1727204733.62642: exiting _queue_task() for managed-node3/ping 43681 1727204733.62653: done queuing things up, now waiting for results queue to drain 43681 1727204733.62655: waiting for pending results... 
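
The "Re-test connectivity" step queued here is a plain ping module call: the log below shows the AnsiballZ payload being shipped over the existing SSH ControlMaster ("auto-mux: Trying existing master") and the module answering {"ping": "pong"}. Expressed as a task it is likely no more than the following sketch (not the role source):

- name: Re-test connectivity
  ping:

The same round trip can be reproduced ad hoc with "ansible managed-node3 -m ping" against the same inventory.
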
43681 1727204733.62846: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 43681 1727204733.62924: in run() - task 12b410aa-8751-9e86-7728-0000000000a6 43681 1727204733.62936: variable 'ansible_search_path' from source: unknown 43681 1727204733.62939: variable 'ansible_search_path' from source: unknown 43681 1727204733.62970: calling self._execute() 43681 1727204733.63062: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204733.63069: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204733.63080: variable 'omit' from source: magic vars 43681 1727204733.63410: variable 'ansible_distribution_major_version' from source: facts 43681 1727204733.63426: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204733.63432: variable 'omit' from source: magic vars 43681 1727204733.63465: variable 'omit' from source: magic vars 43681 1727204733.63496: variable 'omit' from source: magic vars 43681 1727204733.63532: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204733.63567: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204733.63586: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204733.63603: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204733.63614: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204733.63645: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204733.63648: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204733.63653: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204733.63738: Set connection var ansible_shell_type to sh 43681 1727204733.63744: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204733.63752: Set connection var ansible_timeout to 10 43681 1727204733.63768: Set connection var ansible_pipelining to False 43681 1727204733.63775: Set connection var ansible_connection to ssh 43681 1727204733.63778: Set connection var ansible_shell_executable to /bin/sh 43681 1727204733.63797: variable 'ansible_shell_executable' from source: unknown 43681 1727204733.63800: variable 'ansible_connection' from source: unknown 43681 1727204733.63803: variable 'ansible_module_compression' from source: unknown 43681 1727204733.63808: variable 'ansible_shell_type' from source: unknown 43681 1727204733.63811: variable 'ansible_shell_executable' from source: unknown 43681 1727204733.63815: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204733.63820: variable 'ansible_pipelining' from source: unknown 43681 1727204733.63826: variable 'ansible_timeout' from source: unknown 43681 1727204733.63828: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204733.64001: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 43681 1727204733.64013: variable 'omit' from source: magic vars 43681 
1727204733.64019: starting attempt loop 43681 1727204733.64025: running the handler 43681 1727204733.64036: _low_level_execute_command(): starting 43681 1727204733.64044: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 43681 1727204733.64573: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204733.64608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 43681 1727204733.64612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204733.64614: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204733.64617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204733.64673: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204733.64676: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204733.64729: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204733.66493: stdout chunk (state=3): >>>/root <<< 43681 1727204733.66610: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204733.66660: stderr chunk (state=3): >>><<< 43681 1727204733.66665: stdout chunk (state=3): >>><<< 43681 1727204733.66690: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204733.66701: _low_level_execute_command(): starting 43681 1727204733.66707: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204733.6668482-45705-73671934078196 `" && echo 
ansible-tmp-1727204733.6668482-45705-73671934078196="` echo /root/.ansible/tmp/ansible-tmp-1727204733.6668482-45705-73671934078196 `" ) && sleep 0' 43681 1727204733.67165: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204733.67168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204733.67171: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204733.67180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204733.67224: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204733.67228: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204733.67272: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204733.70739: stdout chunk (state=3): >>>ansible-tmp-1727204733.6668482-45705-73671934078196=/root/.ansible/tmp/ansible-tmp-1727204733.6668482-45705-73671934078196 <<< 43681 1727204733.70857: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204733.70905: stderr chunk (state=3): >>><<< 43681 1727204733.70910: stdout chunk (state=3): >>><<< 43681 1727204733.70930: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204733.6668482-45705-73671934078196=/root/.ansible/tmp/ansible-tmp-1727204733.6668482-45705-73671934078196 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204733.70970: variable 'ansible_module_compression' from source: unknown 43681 1727204733.71007: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-436817kzep24q/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 43681 1727204733.71045: variable 'ansible_facts' from source: unknown 43681 1727204733.71101: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204733.6668482-45705-73671934078196/AnsiballZ_ping.py 43681 1727204733.71216: Sending initial data 43681 1727204733.71220: Sent initial data (152 bytes) 43681 1727204733.71652: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204733.71697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204733.71700: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204733.71703: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 43681 1727204733.71706: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204733.71708: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204733.71752: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204733.71755: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204733.71799: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204733.73404: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 43681 1727204733.73440: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 43681 1727204733.73474: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-436817kzep24q/tmp7z7078up /root/.ansible/tmp/ansible-tmp-1727204733.6668482-45705-73671934078196/AnsiballZ_ping.py <<< 43681 1727204733.73478: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204733.6668482-45705-73671934078196/AnsiballZ_ping.py" <<< 43681 1727204733.73506: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-436817kzep24q/tmp7z7078up" to remote "/root/.ansible/tmp/ansible-tmp-1727204733.6668482-45705-73671934078196/AnsiballZ_ping.py" <<< 43681 1727204733.73514: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204733.6668482-45705-73671934078196/AnsiballZ_ping.py" <<< 43681 1727204733.74236: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204733.74309: stderr chunk (state=3): >>><<< 43681 1727204733.74312: stdout chunk (state=3): >>><<< 43681 1727204733.74335: done transferring module to remote 43681 1727204733.74346: _low_level_execute_command(): starting 43681 1727204733.74352: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204733.6668482-45705-73671934078196/ /root/.ansible/tmp/ansible-tmp-1727204733.6668482-45705-73671934078196/AnsiballZ_ping.py && sleep 0' 43681 1727204733.74833: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204733.74839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 43681 1727204733.74842: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 43681 1727204733.74845: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 43681 1727204733.74849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204733.74901: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204733.74912: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204733.74945: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204733.76753: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204733.76801: stderr chunk (state=3): >>><<< 43681 1727204733.76808: stdout chunk (state=3): >>><<< 43681 1727204733.76827: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 
originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204733.76830: _low_level_execute_command(): starting 43681 1727204733.76833: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204733.6668482-45705-73671934078196/AnsiballZ_ping.py && sleep 0' 43681 1727204733.77266: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204733.77276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204733.77308: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204733.77311: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 43681 1727204733.77314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204733.77375: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204733.77381: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204733.77415: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204733.94231: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 43681 1727204733.95604: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
<<< 43681 1727204733.95662: stderr chunk (state=3): >>><<< 43681 1727204733.95666: stdout chunk (state=3): >>><<< 43681 1727204733.95681: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 43681 1727204733.95710: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204733.6668482-45705-73671934078196/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 43681 1727204733.95727: _low_level_execute_command(): starting 43681 1727204733.95730: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204733.6668482-45705-73671934078196/ > /dev/null 2>&1 && sleep 0' 43681 1727204733.96193: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204733.96228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204733.96232: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 43681 1727204733.96235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204733.96237: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204733.96239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204733.96304: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204733.96307: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204733.96309: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204733.96342: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204733.98242: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204733.98285: stderr chunk (state=3): >>><<< 43681 1727204733.98291: stdout chunk (state=3): >>><<< 43681 1727204733.98310: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204733.98318: handler run complete 43681 1727204733.98334: attempt loop complete, returning result 43681 1727204733.98337: _execute() done 43681 1727204733.98339: dumping result to json 43681 1727204733.98345: done dumping result, returning 43681 1727204733.98354: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [12b410aa-8751-9e86-7728-0000000000a6] 43681 1727204733.98360: sending task result for task 12b410aa-8751-9e86-7728-0000000000a6 43681 1727204733.98457: done sending task result for task 12b410aa-8751-9e86-7728-0000000000a6 43681 1727204733.98460: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "ping": "pong" } 43681 1727204733.98554: no more pending results, returning what we have 43681 1727204733.98558: results queue empty 43681 1727204733.98559: checking for any_errors_fatal 43681 1727204733.98567: done checking for any_errors_fatal 43681 1727204733.98568: checking for max_fail_percentage 43681 1727204733.98572: done checking for max_fail_percentage 43681 1727204733.98573: checking to see if all hosts have failed and the running result is not ok 43681 1727204733.98574: done checking to see if all hosts have failed 43681 1727204733.98575: getting the remaining hosts for this loop 43681 1727204733.98576: done getting the remaining hosts for this loop 43681 1727204733.98581: getting the next task for host managed-node3 43681 1727204733.98588: done getting next task for host managed-node3 43681 1727204733.98592: ^ task is: TASK: meta (role_complete) 43681 1727204733.98594: ^ state is: HOST STATE: block=3, task=1, 
rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 43681 1727204733.98605: getting variables 43681 1727204733.98607: in VariableManager get_vars() 43681 1727204733.98646: Calling all_inventory to load vars for managed-node3 43681 1727204733.98649: Calling groups_inventory to load vars for managed-node3 43681 1727204733.98652: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204733.98662: Calling all_plugins_play to load vars for managed-node3 43681 1727204733.98665: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204733.98668: Calling groups_plugins_play to load vars for managed-node3 43681 1727204734.00063: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204734.01685: done with get_vars() 43681 1727204734.01711: done getting variables 43681 1727204734.01782: done queuing things up, now waiting for results queue to drain 43681 1727204734.01784: results queue empty 43681 1727204734.01785: checking for any_errors_fatal 43681 1727204734.01787: done checking for any_errors_fatal 43681 1727204734.01788: checking for max_fail_percentage 43681 1727204734.01788: done checking for max_fail_percentage 43681 1727204734.01791: checking to see if all hosts have failed and the running result is not ok 43681 1727204734.01791: done checking to see if all hosts have failed 43681 1727204734.01792: getting the remaining hosts for this loop 43681 1727204734.01793: done getting the remaining hosts for this loop 43681 1727204734.01795: getting the next task for host managed-node3 43681 1727204734.01798: done getting next task for host managed-node3 43681 1727204734.01799: ^ task is: TASK: meta (flush_handlers) 43681 1727204734.01800: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204734.01802: getting variables 43681 1727204734.01803: in VariableManager get_vars() 43681 1727204734.01814: Calling all_inventory to load vars for managed-node3 43681 1727204734.01816: Calling groups_inventory to load vars for managed-node3 43681 1727204734.01817: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204734.01822: Calling all_plugins_play to load vars for managed-node3 43681 1727204734.01824: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204734.01826: Calling groups_plugins_play to load vars for managed-node3 43681 1727204734.02947: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204734.04571: done with get_vars() 43681 1727204734.04595: done getting variables 43681 1727204734.04637: in VariableManager get_vars() 43681 1727204734.04649: Calling all_inventory to load vars for managed-node3 43681 1727204734.04652: Calling groups_inventory to load vars for managed-node3 43681 1727204734.04655: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204734.04659: Calling all_plugins_play to load vars for managed-node3 43681 1727204734.04661: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204734.04663: Calling groups_plugins_play to load vars for managed-node3 43681 1727204734.05873: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204734.07461: done with get_vars() 43681 1727204734.07488: done queuing things up, now waiting for results queue to drain 43681 1727204734.07492: results queue empty 43681 1727204734.07492: checking for any_errors_fatal 43681 1727204734.07493: done checking for any_errors_fatal 43681 1727204734.07494: checking for max_fail_percentage 43681 1727204734.07495: done checking for max_fail_percentage 43681 1727204734.07495: checking to see if all hosts have failed and the running result is not ok 43681 1727204734.07496: done checking to see if all hosts have failed 43681 1727204734.07497: getting the remaining hosts for this loop 43681 1727204734.07497: done getting the remaining hosts for this loop 43681 1727204734.07500: getting the next task for host managed-node3 43681 1727204734.07503: done getting next task for host managed-node3 43681 1727204734.07504: ^ task is: TASK: meta (flush_handlers) 43681 1727204734.07505: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204734.07508: getting variables 43681 1727204734.07509: in VariableManager get_vars() 43681 1727204734.07517: Calling all_inventory to load vars for managed-node3 43681 1727204734.07519: Calling groups_inventory to load vars for managed-node3 43681 1727204734.07521: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204734.07526: Calling all_plugins_play to load vars for managed-node3 43681 1727204734.07528: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204734.07530: Calling groups_plugins_play to load vars for managed-node3 43681 1727204734.08687: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204734.10270: done with get_vars() 43681 1727204734.10295: done getting variables 43681 1727204734.10338: in VariableManager get_vars() 43681 1727204734.10347: Calling all_inventory to load vars for managed-node3 43681 1727204734.10349: Calling groups_inventory to load vars for managed-node3 43681 1727204734.10351: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204734.10355: Calling all_plugins_play to load vars for managed-node3 43681 1727204734.10357: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204734.10359: Calling groups_plugins_play to load vars for managed-node3 43681 1727204734.11453: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204734.13074: done with get_vars() 43681 1727204734.13103: done queuing things up, now waiting for results queue to drain 43681 1727204734.13105: results queue empty 43681 1727204734.13105: checking for any_errors_fatal 43681 1727204734.13106: done checking for any_errors_fatal 43681 1727204734.13107: checking for max_fail_percentage 43681 1727204734.13108: done checking for max_fail_percentage 43681 1727204734.13108: checking to see if all hosts have failed and the running result is not ok 43681 1727204734.13109: done checking to see if all hosts have failed 43681 1727204734.13109: getting the remaining hosts for this loop 43681 1727204734.13110: done getting the remaining hosts for this loop 43681 1727204734.13112: getting the next task for host managed-node3 43681 1727204734.13115: done getting next task for host managed-node3 43681 1727204734.13115: ^ task is: None 43681 1727204734.13117: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204734.13117: done queuing things up, now waiting for results queue to drain 43681 1727204734.13118: results queue empty 43681 1727204734.13119: checking for any_errors_fatal 43681 1727204734.13119: done checking for any_errors_fatal 43681 1727204734.13120: checking for max_fail_percentage 43681 1727204734.13120: done checking for max_fail_percentage 43681 1727204734.13121: checking to see if all hosts have failed and the running result is not ok 43681 1727204734.13122: done checking to see if all hosts have failed 43681 1727204734.13123: getting the next task for host managed-node3 43681 1727204734.13125: done getting next task for host managed-node3 43681 1727204734.13126: ^ task is: None 43681 1727204734.13127: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 43681 1727204734.13165: in VariableManager get_vars() 43681 1727204734.13177: done with get_vars() 43681 1727204734.13181: in VariableManager get_vars() 43681 1727204734.13188: done with get_vars() 43681 1727204734.13195: variable 'omit' from source: magic vars 43681 1727204734.13222: in VariableManager get_vars() 43681 1727204734.13230: done with get_vars() 43681 1727204734.13246: variable 'omit' from source: magic vars PLAY [Assert device and profile are absent] ************************************ 43681 1727204734.13393: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 43681 1727204734.13416: getting the remaining hosts for this loop 43681 1727204734.13417: done getting the remaining hosts for this loop 43681 1727204734.13419: getting the next task for host managed-node3 43681 1727204734.13425: done getting next task for host managed-node3 43681 1727204734.13427: ^ task is: TASK: Gathering Facts 43681 1727204734.13429: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204734.13430: getting variables 43681 1727204734.13431: in VariableManager get_vars() 43681 1727204734.13439: Calling all_inventory to load vars for managed-node3 43681 1727204734.13441: Calling groups_inventory to load vars for managed-node3 43681 1727204734.13443: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204734.13447: Calling all_plugins_play to load vars for managed-node3 43681 1727204734.13449: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204734.13451: Calling groups_plugins_play to load vars for managed-node3 43681 1727204734.14658: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204734.16265: done with get_vars() 43681 1727204734.16285: done getting variables 43681 1727204734.16327: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:227 Tuesday 24 September 2024 15:05:34 -0400 (0:00:00.539) 0:00:41.830 ***** 43681 1727204734.16349: entering _queue_task() for managed-node3/gather_facts 43681 1727204734.16624: worker is 1 (out of 1 available) 43681 1727204734.16638: exiting _queue_task() for managed-node3/gather_facts 43681 1727204734.16651: done queuing things up, now waiting for results queue to drain 43681 1727204734.16653: waiting for pending results... 
43681 1727204734.16843: running TaskExecutor() for managed-node3/TASK: Gathering Facts 43681 1727204734.16924: in run() - task 12b410aa-8751-9e86-7728-00000000066a 43681 1727204734.16935: variable 'ansible_search_path' from source: unknown 43681 1727204734.16969: calling self._execute() 43681 1727204734.17059: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204734.17063: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204734.17074: variable 'omit' from source: magic vars 43681 1727204734.17417: variable 'ansible_distribution_major_version' from source: facts 43681 1727204734.17433: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204734.17437: variable 'omit' from source: magic vars 43681 1727204734.17464: variable 'omit' from source: magic vars 43681 1727204734.17494: variable 'omit' from source: magic vars 43681 1727204734.17530: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204734.17567: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204734.17586: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204734.17604: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204734.17616: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204734.17650: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204734.17655: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204734.17658: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204734.17739: Set connection var ansible_shell_type to sh 43681 1727204734.17746: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204734.17755: Set connection var ansible_timeout to 10 43681 1727204734.17764: Set connection var ansible_pipelining to False 43681 1727204734.17770: Set connection var ansible_connection to ssh 43681 1727204734.17780: Set connection var ansible_shell_executable to /bin/sh 43681 1727204734.17802: variable 'ansible_shell_executable' from source: unknown 43681 1727204734.17805: variable 'ansible_connection' from source: unknown 43681 1727204734.17808: variable 'ansible_module_compression' from source: unknown 43681 1727204734.17812: variable 'ansible_shell_type' from source: unknown 43681 1727204734.17816: variable 'ansible_shell_executable' from source: unknown 43681 1727204734.17820: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204734.17827: variable 'ansible_pipelining' from source: unknown 43681 1727204734.17830: variable 'ansible_timeout' from source: unknown 43681 1727204734.17836: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204734.17998: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 43681 1727204734.18010: variable 'omit' from source: magic vars 43681 1727204734.18016: starting attempt loop 43681 1727204734.18019: running the 
handler 43681 1727204734.18036: variable 'ansible_facts' from source: unknown 43681 1727204734.18054: _low_level_execute_command(): starting 43681 1727204734.18061: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 43681 1727204734.18600: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204734.18619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204734.18682: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204734.18694: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204734.18697: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204734.18732: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204734.20469: stdout chunk (state=3): >>>/root <<< 43681 1727204734.20577: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204734.20633: stderr chunk (state=3): >>><<< 43681 1727204734.20636: stdout chunk (state=3): >>><<< 43681 1727204734.20660: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204734.20672: _low_level_execute_command(): starting 43681 1727204734.20679: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204734.2066004-45714-119925864227630 `" && echo ansible-tmp-1727204734.2066004-45714-119925864227630="` echo 
/root/.ansible/tmp/ansible-tmp-1727204734.2066004-45714-119925864227630 `" ) && sleep 0' 43681 1727204734.21149: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204734.21154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 43681 1727204734.21157: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204734.21168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204734.21216: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204734.21220: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204734.21263: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204734.23259: stdout chunk (state=3): >>>ansible-tmp-1727204734.2066004-45714-119925864227630=/root/.ansible/tmp/ansible-tmp-1727204734.2066004-45714-119925864227630 <<< 43681 1727204734.23376: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204734.23430: stderr chunk (state=3): >>><<< 43681 1727204734.23436: stdout chunk (state=3): >>><<< 43681 1727204734.23451: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204734.2066004-45714-119925864227630=/root/.ansible/tmp/ansible-tmp-1727204734.2066004-45714-119925864227630 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204734.23477: variable 'ansible_module_compression' from source: unknown 43681 1727204734.23536: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-436817kzep24q/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 43681 1727204734.23592: 
variable 'ansible_facts' from source: unknown 43681 1727204734.23739: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204734.2066004-45714-119925864227630/AnsiballZ_setup.py 43681 1727204734.23865: Sending initial data 43681 1727204734.23868: Sent initial data (154 bytes) 43681 1727204734.24338: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204734.24341: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204734.24344: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204734.24347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204734.24402: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204734.24406: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204734.24446: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204734.26054: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 43681 1727204734.26058: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 43681 1727204734.26087: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 43681 1727204734.26126: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-436817kzep24q/tmp7jn59aik /root/.ansible/tmp/ansible-tmp-1727204734.2066004-45714-119925864227630/AnsiballZ_setup.py <<< 43681 1727204734.26129: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204734.2066004-45714-119925864227630/AnsiballZ_setup.py" <<< 43681 1727204734.26157: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-436817kzep24q/tmp7jn59aik" to remote "/root/.ansible/tmp/ansible-tmp-1727204734.2066004-45714-119925864227630/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204734.2066004-45714-119925864227630/AnsiballZ_setup.py" <<< 43681 1727204734.27796: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204734.27870: stderr chunk (state=3): >>><<< 43681 1727204734.27873: stdout chunk (state=3): >>><<< 43681 1727204734.27901: done transferring module to remote 43681 1727204734.27912: _low_level_execute_command(): starting 43681 1727204734.27919: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204734.2066004-45714-119925864227630/ /root/.ansible/tmp/ansible-tmp-1727204734.2066004-45714-119925864227630/AnsiballZ_setup.py && sleep 0' 43681 1727204734.28398: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204734.28402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 43681 1727204734.28417: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204734.28475: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204734.28480: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204734.28517: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204734.30386: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204734.30445: stderr chunk (state=3): >>><<< 43681 1727204734.30449: stdout chunk (state=3): >>><<< 43681 1727204734.30465: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204734.30469: _low_level_execute_command(): starting 43681 1727204734.30475: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204734.2066004-45714-119925864227630/AnsiballZ_setup.py && sleep 0' 43681 1727204734.30959: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204734.30965: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204734.30967: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204734.30970: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204734.30973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204734.31026: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204734.31029: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204734.31083: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204735.00768: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 50414 10.31.10.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 50414 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": 
"x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec272ed147e29e35f2e68cd6465c5ec1", "ansible_apparmor": {"status": "disabled"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAI5YZQ7OH6eqgmanrwxkUl16pMvE2q26X32NofYRKBzF04m84VIsiCBP80rN+sGEKnRhTwlxJwcSfAyscmxkynk8ozeR0SaMEECkbOjee1DqGR1yz8VSKEIk2gZ+ImYscF6c32jGvz1w/gz9baswEs+v92Ljqv3+V3s8foVkwWM1AAAAFQDApo03iAyJzp9y7AillVl9LpN8rwAAAIBNHNvfLLH/rvWMdavYWGiljarx5Z8cDKFv4QiliuY2AenrQ5mjBN3ZJZuDpmwC9vuoPM+TWxp9pbrnVJy4VM6iS8c/Lr9I982fUD4neMvJEywdnYtsRhezGMCk57/Npw91h6EKhcAYiaFF53jl540WIjTvu2bEA8Hgb11YGH+isAAAAIAkremps+61DEFeDWQjRHbf8fZzhmpUYduU+sHRW5usa/1cOOeeN/8XBHfMST6TPedAY/6t7Oxda9D2mq6mo2Rl9arSQWcBypqwvzRiz0LGnRnElGtXKJALy6vYKG7xi+29ZmqlBvD14cB7/wSZqZP9MkRj3+QzQJLvNnuGRyLguA==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDBj8PEtqglWtlJ3r3hgP2TELjSd8JOOpjIitLlWjKdUao5ePB6PWTf9MZV0rLZr0re7hAS1EWeexARYQakyETmyOoPmRCaD5vvrfN3AJJ6I+O2EhApLpYrEORJbTfrme6AoCGmxQG8tR7j3YpVOvePZ65ka7FDUWnNLI0DWpyDURAKmvOxtiOcYazpmB7GJ/5ycpEAV7KGp7tEQ9MNIAbSaYTBXVBNa5V2HyEmcabs+/Qy/jp8OWy+Tl3uCUV0SmFplVCKib9Kp3eEMZd5udXsYnmUYtLNMJQkQOzTdol5AozustkdBnasVn/RSnQpWQMBrrUQMxchNOb8FDAuH6AONEVJl9mHY6mk3zfkkyPZE6sIrMIj0B48xTWzMIjC+N9SN7DRRUWzjYIqkL5fsYu0fkkGuZeNvyJRlv8h7oFWA7YtvNHdNYf41mkXryERg8V3zI0aZcmQul6XTOxywwd4b5sudMIng09hfyPOKtnYi6DIN2h5FxOWlvBEbLlcd2U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPUvqdp1GSRMDwSqfOZO1hLGpDfzy51B9cIhTK2AWy7qlUXPaSlJ0jc31uj+CW3SnUW36VSKRHdj9R9hJev9Zic=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFL7RdA+aCgUcBhcJBLwti3mnwduhYXxSw8RlI3Cvebm", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_lsb": {}, "ansible_fibre_channel_wwn": [], "ansible_pkg_mgr": "dnf", "ansible_is_chroot": false, "ansible_loadavg": {"1m": 0.654296875, "5m": 0.81494140625, "15m": 0.54052734375}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_iscsi_iqn": "", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {<<< 43681 1727204735.00803: stdout chunk (state=3): >>>"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_local": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, 
"final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "05", "second": "34", "epoch": "1727204734", "epoch_int": "1727204734", "date": "2024-09-24", "time": "15:05:34", "iso8601_micro": "2024-09-24T19:05:34.626581Z", "iso8601": "2024-09-24T19:05:34Z", "iso8601_basic": "20240924T150534626581", "iso8601_basic_short": "20240924T150534", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_service_mgr": "systemd", "ansible_hostnqn": "", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2848, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 869, "free": 2848}, "nocache": {"free": 3481, "used": 236}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_uuid": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, 
"sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize"<<< 43681 1727204735.00824: stdout chunk (state=3): >>>: "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 1238, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251139690496, "block_size": 4096, "block_total": 64479564, "block_available": 61313401, "block_used": 3166163, "inode_total": 16384000, "inode_available": 16302070, "inode_used": 81930, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_fips": false, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::37d3:4e93:30d:de94", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", 
"hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_seg<<< 43681 1727204735.00844: stdout chunk (state=3): >>>mentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.90"], "ansible_all_ipv6_addresses": ["fe80::37d3:4e93:30d:de94"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::37d3:4e93:30d:de94"]}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 43681 1727204735.02996: stderr chunk (state=3): >>>debug2: Received exit status 
from master 0 Shared connection to 10.31.10.90 closed. <<< 43681 1727204735.03060: stderr chunk (state=3): >>><<< 43681 1727204735.03063: stdout chunk (state=3): >>><<< 43681 1727204735.03108: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 50414 10.31.10.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 50414 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec272ed147e29e35f2e68cd6465c5ec1", "ansible_apparmor": {"status": "disabled"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAI5YZQ7OH6eqgmanrwxkUl16pMvE2q26X32NofYRKBzF04m84VIsiCBP80rN+sGEKnRhTwlxJwcSfAyscmxkynk8ozeR0SaMEECkbOjee1DqGR1yz8VSKEIk2gZ+ImYscF6c32jGvz1w/gz9baswEs+v92Ljqv3+V3s8foVkwWM1AAAAFQDApo03iAyJzp9y7AillVl9LpN8rwAAAIBNHNvfLLH/rvWMdavYWGiljarx5Z8cDKFv4QiliuY2AenrQ5mjBN3ZJZuDpmwC9vuoPM+TWxp9pbrnVJy4VM6iS8c/Lr9I982fUD4neMvJEywdnYtsRhezGMCk57/Npw91h6EKhcAYiaFF53jl540WIjTvu2bEA8Hgb11YGH+isAAAAIAkremps+61DEFeDWQjRHbf8fZzhmpUYduU+sHRW5usa/1cOOeeN/8XBHfMST6TPedAY/6t7Oxda9D2mq6mo2Rl9arSQWcBypqwvzRiz0LGnRnElGtXKJALy6vYKG7xi+29ZmqlBvD14cB7/wSZqZP9MkRj3+QzQJLvNnuGRyLguA==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDBj8PEtqglWtlJ3r3hgP2TELjSd8JOOpjIitLlWjKdUao5ePB6PWTf9MZV0rLZr0re7hAS1EWeexARYQakyETmyOoPmRCaD5vvrfN3AJJ6I+O2EhApLpYrEORJbTfrme6AoCGmxQG8tR7j3YpVOvePZ65ka7FDUWnNLI0DWpyDURAKmvOxtiOcYazpmB7GJ/5ycpEAV7KGp7tEQ9MNIAbSaYTBXVBNa5V2HyEmcabs+/Qy/jp8OWy+Tl3uCUV0SmFplVCKib9Kp3eEMZd5udXsYnmUYtLNMJQkQOzTdol5AozustkdBnasVn/RSnQpWQMBrrUQMxchNOb8FDAuH6AONEVJl9mHY6mk3zfkkyPZE6sIrMIj0B48xTWzMIjC+N9SN7DRRUWzjYIqkL5fsYu0fkkGuZeNvyJRlv8h7oFWA7YtvNHdNYf41mkXryERg8V3zI0aZcmQul6XTOxywwd4b5sudMIng09hfyPOKtnYi6DIN2h5FxOWlvBEbLlcd2U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPUvqdp1GSRMDwSqfOZO1hLGpDfzy51B9cIhTK2AWy7qlUXPaSlJ0jc31uj+CW3SnUW36VSKRHdj9R9hJev9Zic=", 
"ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFL7RdA+aCgUcBhcJBLwti3mnwduhYXxSw8RlI3Cvebm", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_lsb": {}, "ansible_fibre_channel_wwn": [], "ansible_pkg_mgr": "dnf", "ansible_is_chroot": false, "ansible_loadavg": {"1m": 0.654296875, "5m": 0.81494140625, "15m": 0.54052734375}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_iscsi_iqn": "", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_local": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "05", "second": "34", "epoch": "1727204734", "epoch_int": "1727204734", "date": "2024-09-24", "time": "15:05:34", "iso8601_micro": "2024-09-24T19:05:34.626581Z", "iso8601": "2024-09-24T19:05:34Z", "iso8601_basic": "20240924T150534626581", "iso8601_basic_short": "20240924T150534", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_service_mgr": "systemd", "ansible_hostnqn": "", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2848, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 869, "free": 2848}, "nocache": {"free": 3481, "used": 236}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": 
"HVM domU", "ansible_product_serial": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_uuid": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 1238, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251139690496, "block_size": 4096, "block_total": 64479564, "block_available": 61313401, "block_used": 3166163, "inode_total": 16384000, "inode_available": 16302070, "inode_used": 81930, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_fips": false, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::37d3:4e93:30d:de94", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", 
"tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": 
"off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.90"], "ansible_all_ipv6_addresses": ["fe80::37d3:4e93:30d:de94"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::37d3:4e93:30d:de94"]}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
43681 1727204735.03449: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204734.2066004-45714-119925864227630/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 43681 1727204735.03470: _low_level_execute_command(): starting 43681 1727204735.03475: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204734.2066004-45714-119925864227630/ > /dev/null 2>&1 && sleep 0' 43681 1727204735.03977: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204735.03981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204735.03984: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204735.03986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204735.04044: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204735.04048: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204735.04050: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204735.04104: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204735.06055: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204735.06107: stderr chunk (state=3): >>><<< 43681 1727204735.06110: stdout chunk (state=3): >>><<< 43681 1727204735.06127: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204735.06137: handler run complete 43681 1727204735.06258: variable 'ansible_facts' from source: unknown 43681 1727204735.06357: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204735.06640: variable 'ansible_facts' from source: unknown 43681 1727204735.06719: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204735.06840: attempt loop complete, returning result 43681 1727204735.06845: _execute() done 43681 1727204735.06850: dumping result to json 43681 1727204735.06874: done dumping result, returning 43681 1727204735.06883: done running TaskExecutor() for managed-node3/TASK: Gathering Facts [12b410aa-8751-9e86-7728-00000000066a] 43681 1727204735.06888: sending task result for task 12b410aa-8751-9e86-7728-00000000066a 43681 1727204735.07241: done sending task result for task 12b410aa-8751-9e86-7728-00000000066a 43681 1727204735.07245: WORKER PROCESS EXITING ok: [managed-node3] 43681 1727204735.07557: no more pending results, returning what we have 43681 1727204735.07560: results queue empty 43681 1727204735.07561: checking for any_errors_fatal 43681 1727204735.07562: done checking for any_errors_fatal 43681 1727204735.07562: checking for max_fail_percentage 43681 1727204735.07564: done checking for max_fail_percentage 43681 1727204735.07564: checking to see if all hosts have failed and the running result is not ok 43681 1727204735.07565: done checking to see if all hosts have failed 43681 1727204735.07566: getting the remaining hosts for this loop 43681 1727204735.07567: done getting the remaining hosts for this loop 43681 1727204735.07569: getting the next task for host managed-node3 43681 1727204735.07573: done getting next task for host managed-node3 43681 1727204735.07574: ^ task is: TASK: meta (flush_handlers) 43681 1727204735.07576: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204735.07579: getting variables 43681 1727204735.07580: in VariableManager get_vars() 43681 1727204735.07601: Calling all_inventory to load vars for managed-node3 43681 1727204735.07603: Calling groups_inventory to load vars for managed-node3 43681 1727204735.07605: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204735.07614: Calling all_plugins_play to load vars for managed-node3 43681 1727204735.07616: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204735.07619: Calling groups_plugins_play to load vars for managed-node3 43681 1727204735.08956: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204735.10598: done with get_vars() 43681 1727204735.10626: done getting variables 43681 1727204735.10688: in VariableManager get_vars() 43681 1727204735.10700: Calling all_inventory to load vars for managed-node3 43681 1727204735.10702: Calling groups_inventory to load vars for managed-node3 43681 1727204735.10704: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204735.10708: Calling all_plugins_play to load vars for managed-node3 43681 1727204735.10710: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204735.10712: Calling groups_plugins_play to load vars for managed-node3 43681 1727204735.11925: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204735.13538: done with get_vars() 43681 1727204735.13570: done queuing things up, now waiting for results queue to drain 43681 1727204735.13572: results queue empty 43681 1727204735.13573: checking for any_errors_fatal 43681 1727204735.13577: done checking for any_errors_fatal 43681 1727204735.13578: checking for max_fail_percentage 43681 1727204735.13578: done checking for max_fail_percentage 43681 1727204735.13579: checking to see if all hosts have failed and the running result is not ok 43681 1727204735.13585: done checking to see if all hosts have failed 43681 1727204735.13586: getting the remaining hosts for this loop 43681 1727204735.13587: done getting the remaining hosts for this loop 43681 1727204735.13591: getting the next task for host managed-node3 43681 1727204735.13594: done getting next task for host managed-node3 43681 1727204735.13597: ^ task is: TASK: Include the task 'assert_profile_absent.yml' 43681 1727204735.13598: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204735.13600: getting variables 43681 1727204735.13601: in VariableManager get_vars() 43681 1727204735.13610: Calling all_inventory to load vars for managed-node3 43681 1727204735.13613: Calling groups_inventory to load vars for managed-node3 43681 1727204735.13615: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204735.13623: Calling all_plugins_play to load vars for managed-node3 43681 1727204735.13625: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204735.13628: Calling groups_plugins_play to load vars for managed-node3 43681 1727204735.14760: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204735.16360: done with get_vars() 43681 1727204735.16382: done getting variables TASK [Include the task 'assert_profile_absent.yml'] **************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:230 Tuesday 24 September 2024 15:05:35 -0400 (0:00:01.001) 0:00:42.831 ***** 43681 1727204735.16452: entering _queue_task() for managed-node3/include_tasks 43681 1727204735.16735: worker is 1 (out of 1 available) 43681 1727204735.16751: exiting _queue_task() for managed-node3/include_tasks 43681 1727204735.16767: done queuing things up, now waiting for results queue to drain 43681 1727204735.16769: waiting for pending results... 43681 1727204735.16961: running TaskExecutor() for managed-node3/TASK: Include the task 'assert_profile_absent.yml' 43681 1727204735.17047: in run() - task 12b410aa-8751-9e86-7728-0000000000a9 43681 1727204735.17061: variable 'ansible_search_path' from source: unknown 43681 1727204735.17095: calling self._execute() 43681 1727204735.17182: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204735.17191: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204735.17201: variable 'omit' from source: magic vars 43681 1727204735.17545: variable 'ansible_distribution_major_version' from source: facts 43681 1727204735.17555: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204735.17561: _execute() done 43681 1727204735.17567: dumping result to json 43681 1727204735.17571: done dumping result, returning 43681 1727204735.17579: done running TaskExecutor() for managed-node3/TASK: Include the task 'assert_profile_absent.yml' [12b410aa-8751-9e86-7728-0000000000a9] 43681 1727204735.17585: sending task result for task 12b410aa-8751-9e86-7728-0000000000a9 43681 1727204735.17690: done sending task result for task 12b410aa-8751-9e86-7728-0000000000a9 43681 1727204735.17693: WORKER PROCESS EXITING 43681 1727204735.17726: no more pending results, returning what we have 43681 1727204735.17731: in VariableManager get_vars() 43681 1727204735.17768: Calling all_inventory to load vars for managed-node3 43681 1727204735.17771: Calling groups_inventory to load vars for managed-node3 43681 1727204735.17775: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204735.17791: Calling all_plugins_play to load vars for managed-node3 43681 1727204735.17795: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204735.17799: Calling groups_plugins_play to load vars for managed-node3 43681 1727204735.19187: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204735.20812: done with get_vars() 43681 
1727204735.20842: variable 'ansible_search_path' from source: unknown 43681 1727204735.20856: we have included files to process 43681 1727204735.20856: generating all_blocks data 43681 1727204735.20858: done generating all_blocks data 43681 1727204735.20858: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 43681 1727204735.20859: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 43681 1727204735.20861: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 43681 1727204735.21006: in VariableManager get_vars() 43681 1727204735.21023: done with get_vars() 43681 1727204735.21118: done processing included file 43681 1727204735.21120: iterating over new_blocks loaded from include file 43681 1727204735.21123: in VariableManager get_vars() 43681 1727204735.21133: done with get_vars() 43681 1727204735.21134: filtering new block on tags 43681 1727204735.21148: done filtering new block on tags 43681 1727204735.21150: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed-node3 43681 1727204735.21156: extending task lists for all hosts with included blocks 43681 1727204735.21212: done extending task lists 43681 1727204735.21213: done processing included files 43681 1727204735.21214: results queue empty 43681 1727204735.21214: checking for any_errors_fatal 43681 1727204735.21216: done checking for any_errors_fatal 43681 1727204735.21216: checking for max_fail_percentage 43681 1727204735.21217: done checking for max_fail_percentage 43681 1727204735.21218: checking to see if all hosts have failed and the running result is not ok 43681 1727204735.21218: done checking to see if all hosts have failed 43681 1727204735.21219: getting the remaining hosts for this loop 43681 1727204735.21220: done getting the remaining hosts for this loop 43681 1727204735.21224: getting the next task for host managed-node3 43681 1727204735.21227: done getting next task for host managed-node3 43681 1727204735.21228: ^ task is: TASK: Include the task 'get_profile_stat.yml' 43681 1727204735.21230: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204735.21232: getting variables 43681 1727204735.21233: in VariableManager get_vars() 43681 1727204735.21240: Calling all_inventory to load vars for managed-node3 43681 1727204735.21242: Calling groups_inventory to load vars for managed-node3 43681 1727204735.21244: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204735.21249: Calling all_plugins_play to load vars for managed-node3 43681 1727204735.21251: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204735.21253: Calling groups_plugins_play to load vars for managed-node3 43681 1727204735.22457: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204735.24078: done with get_vars() 43681 1727204735.24107: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Tuesday 24 September 2024 15:05:35 -0400 (0:00:00.077) 0:00:42.908 ***** 43681 1727204735.24175: entering _queue_task() for managed-node3/include_tasks 43681 1727204735.24461: worker is 1 (out of 1 available) 43681 1727204735.24476: exiting _queue_task() for managed-node3/include_tasks 43681 1727204735.24491: done queuing things up, now waiting for results queue to drain 43681 1727204735.24493: waiting for pending results... 43681 1727204735.24683: running TaskExecutor() for managed-node3/TASK: Include the task 'get_profile_stat.yml' 43681 1727204735.24783: in run() - task 12b410aa-8751-9e86-7728-00000000067b 43681 1727204735.24796: variable 'ansible_search_path' from source: unknown 43681 1727204735.24800: variable 'ansible_search_path' from source: unknown 43681 1727204735.24833: calling self._execute() 43681 1727204735.24924: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204735.24928: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204735.24941: variable 'omit' from source: magic vars 43681 1727204735.25275: variable 'ansible_distribution_major_version' from source: facts 43681 1727204735.25286: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204735.25295: _execute() done 43681 1727204735.25299: dumping result to json 43681 1727204735.25304: done dumping result, returning 43681 1727204735.25310: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_profile_stat.yml' [12b410aa-8751-9e86-7728-00000000067b] 43681 1727204735.25316: sending task result for task 12b410aa-8751-9e86-7728-00000000067b 43681 1727204735.25415: done sending task result for task 12b410aa-8751-9e86-7728-00000000067b 43681 1727204735.25418: WORKER PROCESS EXITING 43681 1727204735.25453: no more pending results, returning what we have 43681 1727204735.25459: in VariableManager get_vars() 43681 1727204735.25497: Calling all_inventory to load vars for managed-node3 43681 1727204735.25500: Calling groups_inventory to load vars for managed-node3 43681 1727204735.25504: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204735.25520: Calling all_plugins_play to load vars for managed-node3 43681 1727204735.25526: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204735.25532: Calling groups_plugins_play to load vars for managed-node3 43681 1727204735.26835: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 43681 1727204735.28451: done with get_vars() 43681 1727204735.28474: variable 'ansible_search_path' from source: unknown 43681 1727204735.28475: variable 'ansible_search_path' from source: unknown 43681 1727204735.28510: we have included files to process 43681 1727204735.28511: generating all_blocks data 43681 1727204735.28513: done generating all_blocks data 43681 1727204735.28514: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 43681 1727204735.28515: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 43681 1727204735.28517: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 43681 1727204735.29396: done processing included file 43681 1727204735.29398: iterating over new_blocks loaded from include file 43681 1727204735.29400: in VariableManager get_vars() 43681 1727204735.29414: done with get_vars() 43681 1727204735.29415: filtering new block on tags 43681 1727204735.29437: done filtering new block on tags 43681 1727204735.29439: in VariableManager get_vars() 43681 1727204735.29448: done with get_vars() 43681 1727204735.29449: filtering new block on tags 43681 1727204735.29465: done filtering new block on tags 43681 1727204735.29467: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node3 43681 1727204735.29471: extending task lists for all hosts with included blocks 43681 1727204735.29558: done extending task lists 43681 1727204735.29559: done processing included files 43681 1727204735.29560: results queue empty 43681 1727204735.29560: checking for any_errors_fatal 43681 1727204735.29563: done checking for any_errors_fatal 43681 1727204735.29564: checking for max_fail_percentage 43681 1727204735.29565: done checking for max_fail_percentage 43681 1727204735.29566: checking to see if all hosts have failed and the running result is not ok 43681 1727204735.29566: done checking to see if all hosts have failed 43681 1727204735.29567: getting the remaining hosts for this loop 43681 1727204735.29568: done getting the remaining hosts for this loop 43681 1727204735.29570: getting the next task for host managed-node3 43681 1727204735.29573: done getting next task for host managed-node3 43681 1727204735.29575: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 43681 1727204735.29577: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204735.29579: getting variables 43681 1727204735.29580: in VariableManager get_vars() 43681 1727204735.29729: Calling all_inventory to load vars for managed-node3 43681 1727204735.29733: Calling groups_inventory to load vars for managed-node3 43681 1727204735.29736: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204735.29741: Calling all_plugins_play to load vars for managed-node3 43681 1727204735.29743: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204735.29745: Calling groups_plugins_play to load vars for managed-node3 43681 1727204735.30828: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204735.32439: done with get_vars() 43681 1727204735.32461: done getting variables 43681 1727204735.32499: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 15:05:35 -0400 (0:00:00.083) 0:00:42.991 ***** 43681 1727204735.32526: entering _queue_task() for managed-node3/set_fact 43681 1727204735.32811: worker is 1 (out of 1 available) 43681 1727204735.32828: exiting _queue_task() for managed-node3/set_fact 43681 1727204735.32844: done queuing things up, now waiting for results queue to drain 43681 1727204735.32846: waiting for pending results... 
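The banner above marks the first task from the just-included get_profile_stat.yml. Judging from the set_fact result printed a little further down (the three lsr_net_profile_* flags, all false), the task at get_profile_stat.yml:3 is presumably close to the following sketch; the actual file in the collection may differ.

```yaml
# Sketch of the task at get_profile_stat.yml:3, inferred from the logged
# set_fact result (all three flags start out false); the real file may differ.
- name: Initialize NM profile exist and ansible_managed comment flag
  ansible.builtin.set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false
```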
43681 1727204735.33047: running TaskExecutor() for managed-node3/TASK: Initialize NM profile exist and ansible_managed comment flag 43681 1727204735.33148: in run() - task 12b410aa-8751-9e86-7728-00000000068a 43681 1727204735.33162: variable 'ansible_search_path' from source: unknown 43681 1727204735.33166: variable 'ansible_search_path' from source: unknown 43681 1727204735.33202: calling self._execute() 43681 1727204735.33286: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204735.33294: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204735.33305: variable 'omit' from source: magic vars 43681 1727204735.33631: variable 'ansible_distribution_major_version' from source: facts 43681 1727204735.33642: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204735.33649: variable 'omit' from source: magic vars 43681 1727204735.33692: variable 'omit' from source: magic vars 43681 1727204735.33726: variable 'omit' from source: magic vars 43681 1727204735.33762: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204735.33798: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204735.33819: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204735.33839: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204735.33850: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204735.33878: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204735.33881: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204735.33893: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204735.33973: Set connection var ansible_shell_type to sh 43681 1727204735.33980: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204735.33987: Set connection var ansible_timeout to 10 43681 1727204735.33998: Set connection var ansible_pipelining to False 43681 1727204735.34005: Set connection var ansible_connection to ssh 43681 1727204735.34015: Set connection var ansible_shell_executable to /bin/sh 43681 1727204735.34039: variable 'ansible_shell_executable' from source: unknown 43681 1727204735.34042: variable 'ansible_connection' from source: unknown 43681 1727204735.34046: variable 'ansible_module_compression' from source: unknown 43681 1727204735.34048: variable 'ansible_shell_type' from source: unknown 43681 1727204735.34051: variable 'ansible_shell_executable' from source: unknown 43681 1727204735.34055: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204735.34063: variable 'ansible_pipelining' from source: unknown 43681 1727204735.34066: variable 'ansible_timeout' from source: unknown 43681 1727204735.34072: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204735.34202: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 43681 1727204735.34216: variable 
'omit' from source: magic vars 43681 1727204735.34219: starting attempt loop 43681 1727204735.34224: running the handler 43681 1727204735.34239: handler run complete 43681 1727204735.34249: attempt loop complete, returning result 43681 1727204735.34252: _execute() done 43681 1727204735.34255: dumping result to json 43681 1727204735.34260: done dumping result, returning 43681 1727204735.34269: done running TaskExecutor() for managed-node3/TASK: Initialize NM profile exist and ansible_managed comment flag [12b410aa-8751-9e86-7728-00000000068a] 43681 1727204735.34276: sending task result for task 12b410aa-8751-9e86-7728-00000000068a 43681 1727204735.34369: done sending task result for task 12b410aa-8751-9e86-7728-00000000068a 43681 1727204735.34373: WORKER PROCESS EXITING ok: [managed-node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 43681 1727204735.34447: no more pending results, returning what we have 43681 1727204735.34452: results queue empty 43681 1727204735.34453: checking for any_errors_fatal 43681 1727204735.34455: done checking for any_errors_fatal 43681 1727204735.34456: checking for max_fail_percentage 43681 1727204735.34458: done checking for max_fail_percentage 43681 1727204735.34459: checking to see if all hosts have failed and the running result is not ok 43681 1727204735.34460: done checking to see if all hosts have failed 43681 1727204735.34461: getting the remaining hosts for this loop 43681 1727204735.34463: done getting the remaining hosts for this loop 43681 1727204735.34468: getting the next task for host managed-node3 43681 1727204735.34475: done getting next task for host managed-node3 43681 1727204735.34478: ^ task is: TASK: Stat profile file 43681 1727204735.34490: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204735.34495: getting variables 43681 1727204735.34497: in VariableManager get_vars() 43681 1727204735.34525: Calling all_inventory to load vars for managed-node3 43681 1727204735.34528: Calling groups_inventory to load vars for managed-node3 43681 1727204735.34532: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204735.34542: Calling all_plugins_play to load vars for managed-node3 43681 1727204735.34546: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204735.34549: Calling groups_plugins_play to load vars for managed-node3 43681 1727204735.39237: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204735.40836: done with get_vars() 43681 1727204735.40863: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 15:05:35 -0400 (0:00:00.084) 0:00:43.076 ***** 43681 1727204735.40938: entering _queue_task() for managed-node3/stat 43681 1727204735.41219: worker is 1 (out of 1 available) 43681 1727204735.41233: exiting _queue_task() for managed-node3/stat 43681 1727204735.41248: done queuing things up, now waiting for results queue to drain 43681 1727204735.41252: waiting for pending results... 43681 1727204735.41455: running TaskExecutor() for managed-node3/TASK: Stat profile file 43681 1727204735.41564: in run() - task 12b410aa-8751-9e86-7728-00000000068b 43681 1727204735.41577: variable 'ansible_search_path' from source: unknown 43681 1727204735.41580: variable 'ansible_search_path' from source: unknown 43681 1727204735.41619: calling self._execute() 43681 1727204735.41710: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204735.41717: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204735.41730: variable 'omit' from source: magic vars 43681 1727204735.42076: variable 'ansible_distribution_major_version' from source: facts 43681 1727204735.42086: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204735.42095: variable 'omit' from source: magic vars 43681 1727204735.42136: variable 'omit' from source: magic vars 43681 1727204735.42223: variable 'profile' from source: include params 43681 1727204735.42230: variable 'interface' from source: set_fact 43681 1727204735.42294: variable 'interface' from source: set_fact 43681 1727204735.42311: variable 'omit' from source: magic vars 43681 1727204735.42358: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204735.42390: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204735.42409: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204735.42428: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204735.42439: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204735.42472: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204735.42475: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204735.42478: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204735.42564: Set connection var ansible_shell_type to sh 43681 1727204735.42568: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204735.42577: Set connection var ansible_timeout to 10 43681 1727204735.42586: Set connection var ansible_pipelining to False 43681 1727204735.42595: Set connection var ansible_connection to ssh 43681 1727204735.42688: Set connection var ansible_shell_executable to /bin/sh 43681 1727204735.42694: variable 'ansible_shell_executable' from source: unknown 43681 1727204735.42697: variable 'ansible_connection' from source: unknown 43681 1727204735.42701: variable 'ansible_module_compression' from source: unknown 43681 1727204735.42704: variable 'ansible_shell_type' from source: unknown 43681 1727204735.42707: variable 'ansible_shell_executable' from source: unknown 43681 1727204735.42709: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204735.42712: variable 'ansible_pipelining' from source: unknown 43681 1727204735.42715: variable 'ansible_timeout' from source: unknown 43681 1727204735.42717: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204735.42821: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 43681 1727204735.42835: variable 'omit' from source: magic vars 43681 1727204735.42842: starting attempt loop 43681 1727204735.42845: running the handler 43681 1727204735.42860: _low_level_execute_command(): starting 43681 1727204735.42867: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 43681 1727204735.43430: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204735.43434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204735.43440: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 43681 1727204735.43444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204735.43497: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204735.43506: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204735.43508: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204735.43547: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204735.45324: stdout chunk (state=3): >>>/root <<< 43681 1727204735.45426: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204735.45491: 
stderr chunk (state=3): >>><<< 43681 1727204735.45495: stdout chunk (state=3): >>><<< 43681 1727204735.45518: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204735.45532: _low_level_execute_command(): starting 43681 1727204735.45544: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204735.455192-45732-184264022347558 `" && echo ansible-tmp-1727204735.455192-45732-184264022347558="` echo /root/.ansible/tmp/ansible-tmp-1727204735.455192-45732-184264022347558 `" ) && sleep 0' 43681 1727204735.46037: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204735.46041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204735.46051: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204735.46054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 43681 1727204735.46057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204735.46100: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204735.46120: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204735.46157: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204735.48129: stdout chunk (state=3): >>>ansible-tmp-1727204735.455192-45732-184264022347558=/root/.ansible/tmp/ansible-tmp-1727204735.455192-45732-184264022347558 <<< 43681 1727204735.48247: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 
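The _low_level_execute_command() calls on either side of this point are Ansible's usual remote workflow for the stat call that follows: discover the remote home directory, create a per-task temp directory under ~/.ansible/tmp, sftp the packaged AnsiballZ_stat.py into it, mark it executable, run it with the remote /usr/bin/python3.12, and finally remove the directory again. Based on the module_args logged further down, the driving task at get_profile_stat.yml:9 is probably close to this sketch; the templated profile name and the register name are assumptions.

```yaml
# Sketch of the task at get_profile_stat.yml:9. The module arguments come from
# the module_args logged below; "{{ profile }}" and the register name are
# assumptions (the log resolves the path to ifcfg-ethtest0).
- name: Stat profile file
  ansible.builtin.stat:
    path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: profile_stat   # hypothetical name for the result used by later tasks
```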
43681 1727204735.48304: stderr chunk (state=3): >>><<< 43681 1727204735.48308: stdout chunk (state=3): >>><<< 43681 1727204735.48330: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204735.455192-45732-184264022347558=/root/.ansible/tmp/ansible-tmp-1727204735.455192-45732-184264022347558 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204735.48382: variable 'ansible_module_compression' from source: unknown 43681 1727204735.48437: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-436817kzep24q/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 43681 1727204735.48476: variable 'ansible_facts' from source: unknown 43681 1727204735.48545: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204735.455192-45732-184264022347558/AnsiballZ_stat.py 43681 1727204735.48667: Sending initial data 43681 1727204735.48671: Sent initial data (152 bytes) 43681 1727204735.49165: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204735.49168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204735.49172: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 43681 1727204735.49174: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204735.49228: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204735.49232: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204735.49330: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204735.49353: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204735.50950: 
stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 43681 1727204735.50953: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 43681 1727204735.50980: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 43681 1727204735.51019: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-436817kzep24q/tmprn8u25us /root/.ansible/tmp/ansible-tmp-1727204735.455192-45732-184264022347558/AnsiballZ_stat.py <<< 43681 1727204735.51024: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204735.455192-45732-184264022347558/AnsiballZ_stat.py" <<< 43681 1727204735.51050: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-436817kzep24q/tmprn8u25us" to remote "/root/.ansible/tmp/ansible-tmp-1727204735.455192-45732-184264022347558/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204735.455192-45732-184264022347558/AnsiballZ_stat.py" <<< 43681 1727204735.52178: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204735.52210: stderr chunk (state=3): >>><<< 43681 1727204735.52219: stdout chunk (state=3): >>><<< 43681 1727204735.52252: done transferring module to remote 43681 1727204735.52270: _low_level_execute_command(): starting 43681 1727204735.52280: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204735.455192-45732-184264022347558/ /root/.ansible/tmp/ansible-tmp-1727204735.455192-45732-184264022347558/AnsiballZ_stat.py && sleep 0' 43681 1727204735.53005: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204735.53080: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204735.53108: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204735.53158: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 43681 1727204735.53209: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204735.55064: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204735.55124: stderr chunk (state=3): >>><<< 43681 1727204735.55128: stdout chunk (state=3): >>><<< 43681 1727204735.55143: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204735.55146: _low_level_execute_command(): starting 43681 1727204735.55153: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204735.455192-45732-184264022347558/AnsiballZ_stat.py && sleep 0' 43681 1727204735.55771: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204735.55849: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204735.55857: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204735.55908: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204735.73108: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 43681 1727204735.74508: stderr chunk (state=3): >>>debug2: Received exit status from master 0 
<<< 43681 1727204735.74619: stderr chunk (state=3): >>>Shared connection to 10.31.10.90 closed. <<< 43681 1727204735.74658: stdout chunk (state=3): >>><<< 43681 1727204735.74662: stderr chunk (state=3): >>><<< 43681 1727204735.74683: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 43681 1727204735.74732: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204735.455192-45732-184264022347558/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 43681 1727204735.74766: _low_level_execute_command(): starting 43681 1727204735.74859: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204735.455192-45732-184264022347558/ > /dev/null 2>&1 && sleep 0' 43681 1727204735.75485: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204735.75503: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204735.75524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204735.75558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204735.75672: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204735.75691: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204735.75710: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204735.75737: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204735.75816: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204735.77831: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204735.77853: stdout chunk (state=3): >>><<< 43681 1727204735.77867: stderr chunk (state=3): >>><<< 43681 1727204735.77894: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204735.77908: handler run complete 43681 1727204735.77957: attempt loop complete, returning result 43681 1727204735.78055: _execute() done 43681 1727204735.78058: dumping result to json 43681 1727204735.78061: done dumping result, returning 43681 1727204735.78063: done running TaskExecutor() for managed-node3/TASK: Stat profile file [12b410aa-8751-9e86-7728-00000000068b] 43681 1727204735.78067: sending task result for task 12b410aa-8751-9e86-7728-00000000068b 43681 1727204735.78153: done sending task result for task 12b410aa-8751-9e86-7728-00000000068b 43681 1727204735.78157: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "stat": { "exists": false } } 43681 1727204735.78243: no more pending results, returning what we have 43681 1727204735.78247: results queue empty 43681 1727204735.78249: checking for any_errors_fatal 43681 1727204735.78260: done checking for any_errors_fatal 43681 1727204735.78261: checking for max_fail_percentage 43681 1727204735.78264: done checking for max_fail_percentage 43681 1727204735.78265: checking to see if all hosts have failed and the running result is not ok 43681 1727204735.78268: done checking to see if all hosts have failed 43681 1727204735.78269: getting the remaining hosts for this loop 43681 
1727204735.78271: done getting the remaining hosts for this loop 43681 1727204735.78276: getting the next task for host managed-node3 43681 1727204735.78284: done getting next task for host managed-node3 43681 1727204735.78287: ^ task is: TASK: Set NM profile exist flag based on the profile files 43681 1727204735.78293: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 43681 1727204735.78299: getting variables 43681 1727204735.78301: in VariableManager get_vars() 43681 1727204735.78339: Calling all_inventory to load vars for managed-node3 43681 1727204735.78343: Calling groups_inventory to load vars for managed-node3 43681 1727204735.78347: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204735.78362: Calling all_plugins_play to load vars for managed-node3 43681 1727204735.78366: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204735.78370: Calling groups_plugins_play to load vars for managed-node3 43681 1727204735.81238: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204735.84714: done with get_vars() 43681 1727204735.84769: done getting variables 43681 1727204735.84857: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 15:05:35 -0400 (0:00:00.439) 0:00:43.515 ***** 43681 1727204735.84904: entering _queue_task() for managed-node3/set_fact 43681 1727204735.85332: worker is 1 (out of 1 available) 43681 1727204735.85346: exiting _queue_task() for managed-node3/set_fact 43681 1727204735.85476: done queuing things up, now waiting for results queue to drain 43681 1727204735.85479: waiting for pending results... 
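For orientation at this point in the run: the "Stat profile file" task that just completed and the "Set NM profile exist flag based on the profile files" task queued here form a stat-then-guard pair in get_profile_stat.yml. A hedged reconstruction from the module arguments echoed above and the skip condition reported below; the exact YAML, the templating of the path from {{ profile }}, and the fact set by the guarded task are assumptions, not quoted from the file:

    # Sketch only; reconstructed from module_args and conditionals in this log.
    - name: Stat profile file
      ansible.builtin.stat:
        path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"
        get_attributes: false
        get_checksum: false
        get_mime: false
      register: profile_stat

    - name: Set NM profile exist flag based on the profile files
      ansible.builtin.set_fact:
        profile_exists: true          # assumed fact name; a skipped task never echoes its arguments
      when: profile_stat.stat.exists  # false here, so the task is skipped below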
43681 1727204735.85815: running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag based on the profile files 43681 1727204735.85910: in run() - task 12b410aa-8751-9e86-7728-00000000068c 43681 1727204735.85914: variable 'ansible_search_path' from source: unknown 43681 1727204735.85994: variable 'ansible_search_path' from source: unknown 43681 1727204735.85998: calling self._execute() 43681 1727204735.86112: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204735.86195: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204735.86200: variable 'omit' from source: magic vars 43681 1727204735.86666: variable 'ansible_distribution_major_version' from source: facts 43681 1727204735.86697: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204735.86873: variable 'profile_stat' from source: set_fact 43681 1727204735.86911: Evaluated conditional (profile_stat.stat.exists): False 43681 1727204735.86920: when evaluation is False, skipping this task 43681 1727204735.86932: _execute() done 43681 1727204735.86940: dumping result to json 43681 1727204735.86994: done dumping result, returning 43681 1727204735.87001: done running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag based on the profile files [12b410aa-8751-9e86-7728-00000000068c] 43681 1727204735.87005: sending task result for task 12b410aa-8751-9e86-7728-00000000068c skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 43681 1727204735.87348: no more pending results, returning what we have 43681 1727204735.87352: results queue empty 43681 1727204735.87353: checking for any_errors_fatal 43681 1727204735.87362: done checking for any_errors_fatal 43681 1727204735.87363: checking for max_fail_percentage 43681 1727204735.87365: done checking for max_fail_percentage 43681 1727204735.87366: checking to see if all hosts have failed and the running result is not ok 43681 1727204735.87367: done checking to see if all hosts have failed 43681 1727204735.87368: getting the remaining hosts for this loop 43681 1727204735.87369: done getting the remaining hosts for this loop 43681 1727204735.87374: getting the next task for host managed-node3 43681 1727204735.87381: done getting next task for host managed-node3 43681 1727204735.87385: ^ task is: TASK: Get NM profile info 43681 1727204735.87392: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204735.87396: getting variables 43681 1727204735.87398: in VariableManager get_vars() 43681 1727204735.87440: Calling all_inventory to load vars for managed-node3 43681 1727204735.87443: Calling groups_inventory to load vars for managed-node3 43681 1727204735.87448: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204735.87461: Calling all_plugins_play to load vars for managed-node3 43681 1727204735.87465: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204735.87470: Calling groups_plugins_play to load vars for managed-node3 43681 1727204735.88006: done sending task result for task 12b410aa-8751-9e86-7728-00000000068c 43681 1727204735.88010: WORKER PROCESS EXITING 43681 1727204735.90284: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204735.93698: done with get_vars() 43681 1727204735.93755: done getting variables 43681 1727204735.93878: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 15:05:35 -0400 (0:00:00.090) 0:00:43.605 ***** 43681 1727204735.93919: entering _queue_task() for managed-node3/shell 43681 1727204735.93923: Creating lock for shell 43681 1727204735.94356: worker is 1 (out of 1 available) 43681 1727204735.94371: exiting _queue_task() for managed-node3/shell 43681 1727204735.94502: done queuing things up, now waiting for results queue to drain 43681 1727204735.94505: waiting for pending results... 
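The "Get NM profile info" task queued here runs through the shell action plugin; the exact command, the registered result, and the ignored failure are all visible further down (nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc, rc=1, "...ignoring", later tested as nm_profile_exists.rc). A minimal sketch of what the task at get_profile_stat.yml:25 plausibly looks like, assuming the interface name is templated from {{ profile }}:

    - name: Get NM profile info
      ansible.builtin.shell: nmcli -f NAME,FILENAME connection show | grep {{ profile }} | grep /etc
      register: nm_profile_exists
      ignore_errors: true   # matches the "...ignoring" on the rc=1 result logged below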
43681 1727204735.94743: running TaskExecutor() for managed-node3/TASK: Get NM profile info 43681 1727204735.94925: in run() - task 12b410aa-8751-9e86-7728-00000000068d 43681 1727204735.94955: variable 'ansible_search_path' from source: unknown 43681 1727204735.94969: variable 'ansible_search_path' from source: unknown 43681 1727204735.95017: calling self._execute() 43681 1727204735.95156: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204735.95172: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204735.95197: variable 'omit' from source: magic vars 43681 1727204735.95703: variable 'ansible_distribution_major_version' from source: facts 43681 1727204735.95729: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204735.95744: variable 'omit' from source: magic vars 43681 1727204735.95814: variable 'omit' from source: magic vars 43681 1727204735.96030: variable 'profile' from source: include params 43681 1727204735.96035: variable 'interface' from source: set_fact 43681 1727204735.96087: variable 'interface' from source: set_fact 43681 1727204735.96118: variable 'omit' from source: magic vars 43681 1727204735.96183: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204735.96247: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204735.96284: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204735.96314: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204735.96336: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204735.96466: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204735.96470: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204735.96473: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204735.96554: Set connection var ansible_shell_type to sh 43681 1727204735.96576: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204735.96591: Set connection var ansible_timeout to 10 43681 1727204735.96611: Set connection var ansible_pipelining to False 43681 1727204735.96624: Set connection var ansible_connection to ssh 43681 1727204735.96637: Set connection var ansible_shell_executable to /bin/sh 43681 1727204735.96670: variable 'ansible_shell_executable' from source: unknown 43681 1727204735.96686: variable 'ansible_connection' from source: unknown 43681 1727204735.96697: variable 'ansible_module_compression' from source: unknown 43681 1727204735.96704: variable 'ansible_shell_type' from source: unknown 43681 1727204735.96716: variable 'ansible_shell_executable' from source: unknown 43681 1727204735.96727: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204735.96791: variable 'ansible_pipelining' from source: unknown 43681 1727204735.96797: variable 'ansible_timeout' from source: unknown 43681 1727204735.96799: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204735.96951: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 43681 1727204735.96972: variable 'omit' from source: magic vars 43681 1727204735.96984: starting attempt loop 43681 1727204735.96994: running the handler 43681 1727204735.97026: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 43681 1727204735.97095: _low_level_execute_command(): starting 43681 1727204735.97098: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 43681 1727204735.97909: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204735.98002: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204735.98059: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204735.98078: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204735.98116: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204735.98210: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204735.99974: stdout chunk (state=3): >>>/root <<< 43681 1727204736.00188: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204736.00193: stdout chunk (state=3): >>><<< 43681 1727204736.00196: stderr chunk (state=3): >>><<< 43681 1727204736.00224: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: 
fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204736.00342: _low_level_execute_command(): starting 43681 1727204736.00346: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204736.0023215-45752-178260691714090 `" && echo ansible-tmp-1727204736.0023215-45752-178260691714090="` echo /root/.ansible/tmp/ansible-tmp-1727204736.0023215-45752-178260691714090 `" ) && sleep 0' 43681 1727204736.00965: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204736.01015: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204736.01146: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204736.01169: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204736.01249: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204736.03217: stdout chunk (state=3): >>>ansible-tmp-1727204736.0023215-45752-178260691714090=/root/.ansible/tmp/ansible-tmp-1727204736.0023215-45752-178260691714090 <<< 43681 1727204736.03387: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204736.03404: stdout chunk (state=3): >>><<< 43681 1727204736.03411: stderr chunk (state=3): >>><<< 43681 1727204736.03432: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204736.0023215-45752-178260691714090=/root/.ansible/tmp/ansible-tmp-1727204736.0023215-45752-178260691714090 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204736.03463: variable 'ansible_module_compression' from source: unknown 43681 1727204736.03516: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-436817kzep24q/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 43681 1727204736.03556: variable 'ansible_facts' from source: unknown 43681 1727204736.03626: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204736.0023215-45752-178260691714090/AnsiballZ_command.py 43681 1727204736.03750: Sending initial data 43681 1727204736.03753: Sent initial data (156 bytes) 43681 1727204736.04219: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204736.04222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204736.04273: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204736.04334: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204736.04396: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204736.06015: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 43681 1727204736.06047: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 43681 1727204736.06081: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-436817kzep24q/tmp1fpsnw5p /root/.ansible/tmp/ansible-tmp-1727204736.0023215-45752-178260691714090/AnsiballZ_command.py <<< 43681 1727204736.06100: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204736.0023215-45752-178260691714090/AnsiballZ_command.py" <<< 43681 1727204736.06129: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-436817kzep24q/tmp1fpsnw5p" to remote "/root/.ansible/tmp/ansible-tmp-1727204736.0023215-45752-178260691714090/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204736.0023215-45752-178260691714090/AnsiballZ_command.py" <<< 43681 1727204736.07302: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204736.07363: stderr chunk (state=3): >>><<< 43681 1727204736.07366: stdout chunk (state=3): >>><<< 43681 1727204736.07388: done transferring module to remote 43681 1727204736.07407: _low_level_execute_command(): starting 43681 1727204736.07414: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204736.0023215-45752-178260691714090/ /root/.ansible/tmp/ansible-tmp-1727204736.0023215-45752-178260691714090/AnsiballZ_command.py && sleep 0' 43681 1727204736.07868: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204736.07872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204736.07875: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204736.07877: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204736.07935: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204736.07938: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204736.07973: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204736.09779: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204736.09827: stderr chunk (state=3): >>><<< 43681 1727204736.09831: stdout chunk (state=3): >>><<< 43681 1727204736.09845: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204736.09851: _low_level_execute_command(): starting 43681 1727204736.09854: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204736.0023215-45752-178260691714090/AnsiballZ_command.py && sleep 0' 43681 1727204736.10303: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204736.10306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204736.10309: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204736.10311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204736.10367: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204736.10372: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204736.10418: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204736.29387: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "start": "2024-09-24 15:05:36.274598", "end": "2024-09-24 15:05:36.292537", "delta": "0:00:00.017939", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 43681 1727204736.31059: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.10.90 closed. 
<<< 43681 1727204736.31124: stderr chunk (state=3): >>><<< 43681 1727204736.31128: stdout chunk (state=3): >>><<< 43681 1727204736.31150: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "start": "2024-09-24 15:05:36.274598", "end": "2024-09-24 15:05:36.292537", "delta": "0:00:00.017939", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.10.90 closed. 
43681 1727204736.31185: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204736.0023215-45752-178260691714090/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 43681 1727204736.31199: _low_level_execute_command(): starting 43681 1727204736.31205: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204736.0023215-45752-178260691714090/ > /dev/null 2>&1 && sleep 0' 43681 1727204736.31671: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204736.31707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204736.31710: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204736.31713: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204736.31715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204736.31763: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204736.31775: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204736.31826: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204736.33745: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204736.33801: stderr chunk (state=3): >>><<< 43681 1727204736.33804: stdout chunk (state=3): >>><<< 43681 1727204736.33821: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204736.33831: handler run complete 43681 1727204736.33856: Evaluated conditional (False): False 43681 1727204736.33866: attempt loop complete, returning result 43681 1727204736.33870: _execute() done 43681 1727204736.33874: dumping result to json 43681 1727204736.33880: done dumping result, returning 43681 1727204736.33892: done running TaskExecutor() for managed-node3/TASK: Get NM profile info [12b410aa-8751-9e86-7728-00000000068d] 43681 1727204736.33898: sending task result for task 12b410aa-8751-9e86-7728-00000000068d 43681 1727204736.34012: done sending task result for task 12b410aa-8751-9e86-7728-00000000068d 43681 1727204736.34015: WORKER PROCESS EXITING fatal: [managed-node3]: FAILED! => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "delta": "0:00:00.017939", "end": "2024-09-24 15:05:36.292537", "rc": 1, "start": "2024-09-24 15:05:36.274598" } MSG: non-zero return code ...ignoring 43681 1727204736.34109: no more pending results, returning what we have 43681 1727204736.34114: results queue empty 43681 1727204736.34115: checking for any_errors_fatal 43681 1727204736.34124: done checking for any_errors_fatal 43681 1727204736.34125: checking for max_fail_percentage 43681 1727204736.34127: done checking for max_fail_percentage 43681 1727204736.34128: checking to see if all hosts have failed and the running result is not ok 43681 1727204736.34129: done checking to see if all hosts have failed 43681 1727204736.34130: getting the remaining hosts for this loop 43681 1727204736.34132: done getting the remaining hosts for this loop 43681 1727204736.34136: getting the next task for host managed-node3 43681 1727204736.34143: done getting next task for host managed-node3 43681 1727204736.34147: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 43681 1727204736.34151: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204736.34155: getting variables 43681 1727204736.34156: in VariableManager get_vars() 43681 1727204736.34187: Calling all_inventory to load vars for managed-node3 43681 1727204736.34192: Calling groups_inventory to load vars for managed-node3 43681 1727204736.34196: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204736.34215: Calling all_plugins_play to load vars for managed-node3 43681 1727204736.34219: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204736.34223: Calling groups_plugins_play to load vars for managed-node3 43681 1727204736.35646: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204736.37268: done with get_vars() 43681 1727204736.37296: done getting variables 43681 1727204736.37349: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 15:05:36 -0400 (0:00:00.434) 0:00:44.040 ***** 43681 1727204736.37377: entering _queue_task() for managed-node3/set_fact 43681 1727204736.37657: worker is 1 (out of 1 available) 43681 1727204736.37671: exiting _queue_task() for managed-node3/set_fact 43681 1727204736.37685: done queuing things up, now waiting for results queue to drain 43681 1727204736.37687: waiting for pending results... 
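The set_fact task queued here is guarded by the nmcli result; the skip record just below reports false_condition: nm_profile_exists.rc == 0. A hedged sketch of its shape only; the fact names are assumptions, since a skipped task never echoes its arguments:

    - name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
      ansible.builtin.set_fact:
        nm_profile_exists_flag: true   # assumed fact name
        ansible_managed_flag: true     # assumed fact name
      when: nm_profile_exists.rc == 0  # rc was 1 above, so this is skipped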
43681 1727204736.37894: running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 43681 1727204736.37993: in run() - task 12b410aa-8751-9e86-7728-00000000068e 43681 1727204736.38006: variable 'ansible_search_path' from source: unknown 43681 1727204736.38009: variable 'ansible_search_path' from source: unknown 43681 1727204736.38050: calling self._execute() 43681 1727204736.38145: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204736.38153: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204736.38165: variable 'omit' from source: magic vars 43681 1727204736.38507: variable 'ansible_distribution_major_version' from source: facts 43681 1727204736.38518: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204736.38638: variable 'nm_profile_exists' from source: set_fact 43681 1727204736.38652: Evaluated conditional (nm_profile_exists.rc == 0): False 43681 1727204736.38656: when evaluation is False, skipping this task 43681 1727204736.38659: _execute() done 43681 1727204736.38662: dumping result to json 43681 1727204736.38667: done dumping result, returning 43681 1727204736.38676: done running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [12b410aa-8751-9e86-7728-00000000068e] 43681 1727204736.38679: sending task result for task 12b410aa-8751-9e86-7728-00000000068e 43681 1727204736.38774: done sending task result for task 12b410aa-8751-9e86-7728-00000000068e 43681 1727204736.38777: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 43681 1727204736.38841: no more pending results, returning what we have 43681 1727204736.38846: results queue empty 43681 1727204736.38847: checking for any_errors_fatal 43681 1727204736.38856: done checking for any_errors_fatal 43681 1727204736.38857: checking for max_fail_percentage 43681 1727204736.38859: done checking for max_fail_percentage 43681 1727204736.38860: checking to see if all hosts have failed and the running result is not ok 43681 1727204736.38862: done checking to see if all hosts have failed 43681 1727204736.38863: getting the remaining hosts for this loop 43681 1727204736.38864: done getting the remaining hosts for this loop 43681 1727204736.38869: getting the next task for host managed-node3 43681 1727204736.38879: done getting next task for host managed-node3 43681 1727204736.38882: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 43681 1727204736.38886: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 43681 1727204736.38892: getting variables 43681 1727204736.38894: in VariableManager get_vars() 43681 1727204736.38924: Calling all_inventory to load vars for managed-node3 43681 1727204736.38926: Calling groups_inventory to load vars for managed-node3 43681 1727204736.38930: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204736.38942: Calling all_plugins_play to load vars for managed-node3 43681 1727204736.38945: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204736.38949: Calling groups_plugins_play to load vars for managed-node3 43681 1727204736.40222: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204736.41843: done with get_vars() 43681 1727204736.41869: done getting variables 43681 1727204736.41923: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 43681 1727204736.42028: variable 'profile' from source: include params 43681 1727204736.42032: variable 'interface' from source: set_fact 43681 1727204736.42086: variable 'interface' from source: set_fact TASK [Get the ansible_managed comment in ifcfg-ethtest0] *********************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 15:05:36 -0400 (0:00:00.047) 0:00:44.087 ***** 43681 1727204736.42114: entering _queue_task() for managed-node3/command 43681 1727204736.42383: worker is 1 (out of 1 available) 43681 1727204736.42399: exiting _queue_task() for managed-node3/command 43681 1727204736.42413: done queuing things up, now waiting for results queue to drain 43681 1727204736.42415: waiting for pending results... 
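The remaining get_profile_stat.yml tasks in this block ("Get the ansible_managed comment...", "Verify the ansible_managed comment...", "Get the fingerprint comment...") share one pattern: names templated from {{ profile }} and a guard on the stat result, so with profile_stat.stat.exists false each is skipped in turn below. A hedged sketch of the shape of the first of them; the command body is an illustrative guess only, since a skipped task never echoes its arguments:

    - name: Get the ansible_managed comment in ifcfg-{{ profile }}
      ansible.builtin.command: grep ansible_managed /etc/sysconfig/network-scripts/ifcfg-{{ profile }}   # illustrative guess, not taken from the log
      when: profile_stat.stat.exists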
43681 1727204736.42611: running TaskExecutor() for managed-node3/TASK: Get the ansible_managed comment in ifcfg-ethtest0 43681 1727204736.42709: in run() - task 12b410aa-8751-9e86-7728-000000000690 43681 1727204736.42721: variable 'ansible_search_path' from source: unknown 43681 1727204736.42727: variable 'ansible_search_path' from source: unknown 43681 1727204736.42765: calling self._execute() 43681 1727204736.42858: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204736.42871: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204736.42875: variable 'omit' from source: magic vars 43681 1727204736.43196: variable 'ansible_distribution_major_version' from source: facts 43681 1727204736.43210: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204736.43321: variable 'profile_stat' from source: set_fact 43681 1727204736.43333: Evaluated conditional (profile_stat.stat.exists): False 43681 1727204736.43337: when evaluation is False, skipping this task 43681 1727204736.43340: _execute() done 43681 1727204736.43343: dumping result to json 43681 1727204736.43348: done dumping result, returning 43681 1727204736.43355: done running TaskExecutor() for managed-node3/TASK: Get the ansible_managed comment in ifcfg-ethtest0 [12b410aa-8751-9e86-7728-000000000690] 43681 1727204736.43361: sending task result for task 12b410aa-8751-9e86-7728-000000000690 43681 1727204736.43455: done sending task result for task 12b410aa-8751-9e86-7728-000000000690 43681 1727204736.43458: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 43681 1727204736.43515: no more pending results, returning what we have 43681 1727204736.43520: results queue empty 43681 1727204736.43521: checking for any_errors_fatal 43681 1727204736.43529: done checking for any_errors_fatal 43681 1727204736.43530: checking for max_fail_percentage 43681 1727204736.43532: done checking for max_fail_percentage 43681 1727204736.43533: checking to see if all hosts have failed and the running result is not ok 43681 1727204736.43534: done checking to see if all hosts have failed 43681 1727204736.43535: getting the remaining hosts for this loop 43681 1727204736.43537: done getting the remaining hosts for this loop 43681 1727204736.43542: getting the next task for host managed-node3 43681 1727204736.43549: done getting next task for host managed-node3 43681 1727204736.43552: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 43681 1727204736.43557: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204736.43561: getting variables 43681 1727204736.43563: in VariableManager get_vars() 43681 1727204736.43593: Calling all_inventory to load vars for managed-node3 43681 1727204736.43596: Calling groups_inventory to load vars for managed-node3 43681 1727204736.43600: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204736.43613: Calling all_plugins_play to load vars for managed-node3 43681 1727204736.43616: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204736.43620: Calling groups_plugins_play to load vars for managed-node3 43681 1727204736.44995: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204736.46613: done with get_vars() 43681 1727204736.46641: done getting variables 43681 1727204736.46692: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 43681 1727204736.46785: variable 'profile' from source: include params 43681 1727204736.46788: variable 'interface' from source: set_fact 43681 1727204736.46837: variable 'interface' from source: set_fact TASK [Verify the ansible_managed comment in ifcfg-ethtest0] ******************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 15:05:36 -0400 (0:00:00.047) 0:00:44.135 ***** 43681 1727204736.46865: entering _queue_task() for managed-node3/set_fact 43681 1727204736.47133: worker is 1 (out of 1 available) 43681 1727204736.47146: exiting _queue_task() for managed-node3/set_fact 43681 1727204736.47162: done queuing things up, now waiting for results queue to drain 43681 1727204736.47164: waiting for pending results... 
43681 1727204736.47364: running TaskExecutor() for managed-node3/TASK: Verify the ansible_managed comment in ifcfg-ethtest0 43681 1727204736.47464: in run() - task 12b410aa-8751-9e86-7728-000000000691 43681 1727204736.47477: variable 'ansible_search_path' from source: unknown 43681 1727204736.47481: variable 'ansible_search_path' from source: unknown 43681 1727204736.47520: calling self._execute() 43681 1727204736.47614: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204736.47627: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204736.47631: variable 'omit' from source: magic vars 43681 1727204736.47951: variable 'ansible_distribution_major_version' from source: facts 43681 1727204736.47963: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204736.48074: variable 'profile_stat' from source: set_fact 43681 1727204736.48087: Evaluated conditional (profile_stat.stat.exists): False 43681 1727204736.48092: when evaluation is False, skipping this task 43681 1727204736.48095: _execute() done 43681 1727204736.48100: dumping result to json 43681 1727204736.48103: done dumping result, returning 43681 1727204736.48111: done running TaskExecutor() for managed-node3/TASK: Verify the ansible_managed comment in ifcfg-ethtest0 [12b410aa-8751-9e86-7728-000000000691] 43681 1727204736.48117: sending task result for task 12b410aa-8751-9e86-7728-000000000691 43681 1727204736.48210: done sending task result for task 12b410aa-8751-9e86-7728-000000000691 43681 1727204736.48213: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 43681 1727204736.48267: no more pending results, returning what we have 43681 1727204736.48271: results queue empty 43681 1727204736.48272: checking for any_errors_fatal 43681 1727204736.48282: done checking for any_errors_fatal 43681 1727204736.48283: checking for max_fail_percentage 43681 1727204736.48285: done checking for max_fail_percentage 43681 1727204736.48286: checking to see if all hosts have failed and the running result is not ok 43681 1727204736.48288: done checking to see if all hosts have failed 43681 1727204736.48290: getting the remaining hosts for this loop 43681 1727204736.48292: done getting the remaining hosts for this loop 43681 1727204736.48296: getting the next task for host managed-node3 43681 1727204736.48304: done getting next task for host managed-node3 43681 1727204736.48307: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 43681 1727204736.48311: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204736.48315: getting variables 43681 1727204736.48316: in VariableManager get_vars() 43681 1727204736.48349: Calling all_inventory to load vars for managed-node3 43681 1727204736.48352: Calling groups_inventory to load vars for managed-node3 43681 1727204736.48356: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204736.48369: Calling all_plugins_play to load vars for managed-node3 43681 1727204736.48372: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204736.48376: Calling groups_plugins_play to load vars for managed-node3 43681 1727204736.49751: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204736.51387: done with get_vars() 43681 1727204736.51416: done getting variables 43681 1727204736.51471: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 43681 1727204736.51570: variable 'profile' from source: include params 43681 1727204736.51574: variable 'interface' from source: set_fact 43681 1727204736.51624: variable 'interface' from source: set_fact TASK [Get the fingerprint comment in ifcfg-ethtest0] *************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 15:05:36 -0400 (0:00:00.047) 0:00:44.183 ***** 43681 1727204736.51650: entering _queue_task() for managed-node3/command 43681 1727204736.51928: worker is 1 (out of 1 available) 43681 1727204736.51942: exiting _queue_task() for managed-node3/command 43681 1727204736.51956: done queuing things up, now waiting for results queue to drain 43681 1727204736.51958: waiting for pending results... 
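The command task queued here, like the set_fact task before it and the one that follows, is guarded by the same condition, profile_stat.stat.exists, and the stat registered earlier in get_profile_stat.yml reported that the profile file does not exist, so each of these verification steps is skipped with "false_condition": "profile_stat.stat.exists". A hedged sketch of that pattern; the register name, the module kinds and the conditional match the log, while the stat path, the grep pattern and the second register name are assumptions:

    - name: Get stat of the profile file
      stat:
        path: /etc/sysconfig/network-scripts/ifcfg-{{ profile }}   # assumed location
      register: profile_stat

    - name: Get the fingerprint comment in ifcfg-{{ profile }}
      command: grep '^#' /etc/sysconfig/network-scripts/ifcfg-{{ profile }}   # placeholder command
      register: fingerprint_comment                                           # assumed register name
      when: profile_stat.stat.exists   # False here, so the task is skipped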
43681 1727204736.52152: running TaskExecutor() for managed-node3/TASK: Get the fingerprint comment in ifcfg-ethtest0 43681 1727204736.52255: in run() - task 12b410aa-8751-9e86-7728-000000000692 43681 1727204736.52267: variable 'ansible_search_path' from source: unknown 43681 1727204736.52270: variable 'ansible_search_path' from source: unknown 43681 1727204736.52306: calling self._execute() 43681 1727204736.52395: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204736.52408: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204736.52417: variable 'omit' from source: magic vars 43681 1727204736.52730: variable 'ansible_distribution_major_version' from source: facts 43681 1727204736.52739: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204736.52852: variable 'profile_stat' from source: set_fact 43681 1727204736.52865: Evaluated conditional (profile_stat.stat.exists): False 43681 1727204736.52868: when evaluation is False, skipping this task 43681 1727204736.52871: _execute() done 43681 1727204736.52875: dumping result to json 43681 1727204736.52880: done dumping result, returning 43681 1727204736.52888: done running TaskExecutor() for managed-node3/TASK: Get the fingerprint comment in ifcfg-ethtest0 [12b410aa-8751-9e86-7728-000000000692] 43681 1727204736.52899: sending task result for task 12b410aa-8751-9e86-7728-000000000692 43681 1727204736.52993: done sending task result for task 12b410aa-8751-9e86-7728-000000000692 43681 1727204736.52996: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 43681 1727204736.53052: no more pending results, returning what we have 43681 1727204736.53057: results queue empty 43681 1727204736.53059: checking for any_errors_fatal 43681 1727204736.53065: done checking for any_errors_fatal 43681 1727204736.53066: checking for max_fail_percentage 43681 1727204736.53068: done checking for max_fail_percentage 43681 1727204736.53069: checking to see if all hosts have failed and the running result is not ok 43681 1727204736.53071: done checking to see if all hosts have failed 43681 1727204736.53072: getting the remaining hosts for this loop 43681 1727204736.53073: done getting the remaining hosts for this loop 43681 1727204736.53078: getting the next task for host managed-node3 43681 1727204736.53085: done getting next task for host managed-node3 43681 1727204736.53088: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 43681 1727204736.53094: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204736.53099: getting variables 43681 1727204736.53100: in VariableManager get_vars() 43681 1727204736.53132: Calling all_inventory to load vars for managed-node3 43681 1727204736.53135: Calling groups_inventory to load vars for managed-node3 43681 1727204736.53139: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204736.53150: Calling all_plugins_play to load vars for managed-node3 43681 1727204736.53155: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204736.53158: Calling groups_plugins_play to load vars for managed-node3 43681 1727204736.54423: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204736.56046: done with get_vars() 43681 1727204736.56069: done getting variables 43681 1727204736.56119: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 43681 1727204736.56210: variable 'profile' from source: include params 43681 1727204736.56213: variable 'interface' from source: set_fact 43681 1727204736.56263: variable 'interface' from source: set_fact TASK [Verify the fingerprint comment in ifcfg-ethtest0] ************************ task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 15:05:36 -0400 (0:00:00.046) 0:00:44.229 ***** 43681 1727204736.56288: entering _queue_task() for managed-node3/set_fact 43681 1727204736.56546: worker is 1 (out of 1 available) 43681 1727204736.56561: exiting _queue_task() for managed-node3/set_fact 43681 1727204736.56574: done queuing things up, now waiting for results queue to drain 43681 1727204736.56576: waiting for pending results... 
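Before the profile_stat condition is even consulted, every task in this section passes through the same distribution gate, which the log reports each time as Evaluated conditional (ansible_distribution_major_version != '6'): True. A self-contained illustration of that gate, not taken from the test files:

    - name: Skip the legacy EL6 code path
      debug:
        msg: "running on major version {{ ansible_distribution_major_version }}"
      when: ansible_distribution_major_version != '6'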
43681 1727204736.56768: running TaskExecutor() for managed-node3/TASK: Verify the fingerprint comment in ifcfg-ethtest0 43681 1727204736.56863: in run() - task 12b410aa-8751-9e86-7728-000000000693 43681 1727204736.56876: variable 'ansible_search_path' from source: unknown 43681 1727204736.56879: variable 'ansible_search_path' from source: unknown 43681 1727204736.56916: calling self._execute() 43681 1727204736.57003: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204736.57010: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204736.57023: variable 'omit' from source: magic vars 43681 1727204736.57339: variable 'ansible_distribution_major_version' from source: facts 43681 1727204736.57350: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204736.57455: variable 'profile_stat' from source: set_fact 43681 1727204736.57473: Evaluated conditional (profile_stat.stat.exists): False 43681 1727204736.57476: when evaluation is False, skipping this task 43681 1727204736.57479: _execute() done 43681 1727204736.57482: dumping result to json 43681 1727204736.57484: done dumping result, returning 43681 1727204736.57491: done running TaskExecutor() for managed-node3/TASK: Verify the fingerprint comment in ifcfg-ethtest0 [12b410aa-8751-9e86-7728-000000000693] 43681 1727204736.57498: sending task result for task 12b410aa-8751-9e86-7728-000000000693 43681 1727204736.57590: done sending task result for task 12b410aa-8751-9e86-7728-000000000693 43681 1727204736.57593: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 43681 1727204736.57646: no more pending results, returning what we have 43681 1727204736.57650: results queue empty 43681 1727204736.57651: checking for any_errors_fatal 43681 1727204736.57659: done checking for any_errors_fatal 43681 1727204736.57660: checking for max_fail_percentage 43681 1727204736.57662: done checking for max_fail_percentage 43681 1727204736.57663: checking to see if all hosts have failed and the running result is not ok 43681 1727204736.57665: done checking to see if all hosts have failed 43681 1727204736.57665: getting the remaining hosts for this loop 43681 1727204736.57667: done getting the remaining hosts for this loop 43681 1727204736.57672: getting the next task for host managed-node3 43681 1727204736.57680: done getting next task for host managed-node3 43681 1727204736.57683: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 43681 1727204736.57686: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204736.57692: getting variables 43681 1727204736.57694: in VariableManager get_vars() 43681 1727204736.57722: Calling all_inventory to load vars for managed-node3 43681 1727204736.57725: Calling groups_inventory to load vars for managed-node3 43681 1727204736.57728: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204736.57739: Calling all_plugins_play to load vars for managed-node3 43681 1727204736.57742: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204736.57746: Calling groups_plugins_play to load vars for managed-node3 43681 1727204736.59118: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204736.60727: done with get_vars() 43681 1727204736.60754: done getting variables 43681 1727204736.60807: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 43681 1727204736.60905: variable 'profile' from source: include params 43681 1727204736.60909: variable 'interface' from source: set_fact 43681 1727204736.60959: variable 'interface' from source: set_fact TASK [Assert that the profile is absent - 'ethtest0'] ************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Tuesday 24 September 2024 15:05:36 -0400 (0:00:00.046) 0:00:44.276 ***** 43681 1727204736.60986: entering _queue_task() for managed-node3/assert 43681 1727204736.61260: worker is 1 (out of 1 available) 43681 1727204736.61276: exiting _queue_task() for managed-node3/assert 43681 1727204736.61291: done queuing things up, now waiting for results queue to drain 43681 1727204736.61293: waiting for pending results... 
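The assert task queued here is the point of the whole include: it consumes the lsr_net_profile_exists fact and passes because the profile is gone. A reconstructed sketch of assert_profile_absent.yml:5, based only on what the log exposes (the task name, the assert action and the expression not lsr_net_profile_exists); the failure message is an assumption:

    - name: Assert that the profile is absent - '{{ profile }}'
      assert:
        that:
          - not lsr_net_profile_exists
        fail_msg: "Profile {{ profile }} is unexpectedly still present"   # assumed wording

The evaluation of that expression and the "All assertions passed" result appear in the worker output below.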
43681 1727204736.61722: running TaskExecutor() for managed-node3/TASK: Assert that the profile is absent - 'ethtest0' 43681 1727204736.61760: in run() - task 12b410aa-8751-9e86-7728-00000000067c 43681 1727204736.61785: variable 'ansible_search_path' from source: unknown 43681 1727204736.61798: variable 'ansible_search_path' from source: unknown 43681 1727204736.61855: calling self._execute() 43681 1727204736.61985: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204736.62006: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204736.62032: variable 'omit' from source: magic vars 43681 1727204736.62523: variable 'ansible_distribution_major_version' from source: facts 43681 1727204736.62546: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204736.62563: variable 'omit' from source: magic vars 43681 1727204736.62626: variable 'omit' from source: magic vars 43681 1727204736.62764: variable 'profile' from source: include params 43681 1727204736.62779: variable 'interface' from source: set_fact 43681 1727204736.62866: variable 'interface' from source: set_fact 43681 1727204736.62900: variable 'omit' from source: magic vars 43681 1727204736.63006: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204736.63010: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204736.63033: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204736.63060: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204736.63080: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204736.63129: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204736.63141: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204736.63150: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204736.63284: Set connection var ansible_shell_type to sh 43681 1727204736.63300: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204736.63332: Set connection var ansible_timeout to 10 43681 1727204736.63335: Set connection var ansible_pipelining to False 43681 1727204736.63345: Set connection var ansible_connection to ssh 43681 1727204736.63440: Set connection var ansible_shell_executable to /bin/sh 43681 1727204736.63443: variable 'ansible_shell_executable' from source: unknown 43681 1727204736.63446: variable 'ansible_connection' from source: unknown 43681 1727204736.63448: variable 'ansible_module_compression' from source: unknown 43681 1727204736.63450: variable 'ansible_shell_type' from source: unknown 43681 1727204736.63452: variable 'ansible_shell_executable' from source: unknown 43681 1727204736.63454: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204736.63457: variable 'ansible_pipelining' from source: unknown 43681 1727204736.63459: variable 'ansible_timeout' from source: unknown 43681 1727204736.63461: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204736.63633: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 43681 1727204736.63660: variable 'omit' from source: magic vars 43681 1727204736.63675: starting attempt loop 43681 1727204736.63684: running the handler 43681 1727204736.63855: variable 'lsr_net_profile_exists' from source: set_fact 43681 1727204736.63872: Evaluated conditional (not lsr_net_profile_exists): True 43681 1727204736.63893: handler run complete 43681 1727204736.63921: attempt loop complete, returning result 43681 1727204736.63932: _execute() done 43681 1727204736.63988: dumping result to json 43681 1727204736.63995: done dumping result, returning 43681 1727204736.63998: done running TaskExecutor() for managed-node3/TASK: Assert that the profile is absent - 'ethtest0' [12b410aa-8751-9e86-7728-00000000067c] 43681 1727204736.64000: sending task result for task 12b410aa-8751-9e86-7728-00000000067c ok: [managed-node3] => { "changed": false } MSG: All assertions passed 43681 1727204736.64257: no more pending results, returning what we have 43681 1727204736.64263: results queue empty 43681 1727204736.64265: checking for any_errors_fatal 43681 1727204736.64273: done checking for any_errors_fatal 43681 1727204736.64274: checking for max_fail_percentage 43681 1727204736.64277: done checking for max_fail_percentage 43681 1727204736.64278: checking to see if all hosts have failed and the running result is not ok 43681 1727204736.64279: done checking to see if all hosts have failed 43681 1727204736.64280: getting the remaining hosts for this loop 43681 1727204736.64282: done getting the remaining hosts for this loop 43681 1727204736.64288: getting the next task for host managed-node3 43681 1727204736.64301: done getting next task for host managed-node3 43681 1727204736.64307: ^ task is: TASK: Include the task 'assert_device_absent.yml' 43681 1727204736.64310: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204736.64317: getting variables 43681 1727204736.64319: in VariableManager get_vars() 43681 1727204736.64358: Calling all_inventory to load vars for managed-node3 43681 1727204736.64363: Calling groups_inventory to load vars for managed-node3 43681 1727204736.64368: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204736.64383: Calling all_plugins_play to load vars for managed-node3 43681 1727204736.64388: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204736.64598: Calling groups_plugins_play to load vars for managed-node3 43681 1727204736.65308: done sending task result for task 12b410aa-8751-9e86-7728-00000000067c 43681 1727204736.65313: WORKER PROCESS EXITING 43681 1727204736.66779: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204736.68412: done with get_vars() 43681 1727204736.68441: done getting variables TASK [Include the task 'assert_device_absent.yml'] ***************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:234 Tuesday 24 September 2024 15:05:36 -0400 (0:00:00.075) 0:00:44.351 ***** 43681 1727204736.68526: entering _queue_task() for managed-node3/include_tasks 43681 1727204736.68806: worker is 1 (out of 1 available) 43681 1727204736.68821: exiting _queue_task() for managed-node3/include_tasks 43681 1727204736.68835: done queuing things up, now waiting for results queue to drain 43681 1727204736.68838: waiting for pending results... 43681 1727204736.69043: running TaskExecutor() for managed-node3/TASK: Include the task 'assert_device_absent.yml' 43681 1727204736.69130: in run() - task 12b410aa-8751-9e86-7728-0000000000aa 43681 1727204736.69142: variable 'ansible_search_path' from source: unknown 43681 1727204736.69176: calling self._execute() 43681 1727204736.69289: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204736.69294: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204736.69307: variable 'omit' from source: magic vars 43681 1727204736.69640: variable 'ansible_distribution_major_version' from source: facts 43681 1727204736.69651: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204736.69659: _execute() done 43681 1727204736.69663: dumping result to json 43681 1727204736.69668: done dumping result, returning 43681 1727204736.69676: done running TaskExecutor() for managed-node3/TASK: Include the task 'assert_device_absent.yml' [12b410aa-8751-9e86-7728-0000000000aa] 43681 1727204736.69682: sending task result for task 12b410aa-8751-9e86-7728-0000000000aa 43681 1727204736.69779: done sending task result for task 12b410aa-8751-9e86-7728-0000000000aa 43681 1727204736.69782: WORKER PROCESS EXITING 43681 1727204736.69817: no more pending results, returning what we have 43681 1727204736.69823: in VariableManager get_vars() 43681 1727204736.69860: Calling all_inventory to load vars for managed-node3 43681 1727204736.69863: Calling groups_inventory to load vars for managed-node3 43681 1727204736.69867: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204736.69881: Calling all_plugins_play to load vars for managed-node3 43681 1727204736.69885: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204736.69888: Calling groups_plugins_play to load vars for managed-node3 43681 1727204736.71172: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204736.72775: done with get_vars() 43681 1727204736.72798: variable 'ansible_search_path' from source: unknown 43681 1727204736.72811: we have included files to process 43681 1727204736.72812: generating all_blocks data 43681 1727204736.72814: done generating all_blocks data 43681 1727204736.72820: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 43681 1727204736.72822: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 43681 1727204736.72825: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 43681 1727204736.72958: in VariableManager get_vars() 43681 1727204736.72971: done with get_vars() 43681 1727204736.73065: done processing included file 43681 1727204736.73067: iterating over new_blocks loaded from include file 43681 1727204736.73068: in VariableManager get_vars() 43681 1727204736.73076: done with get_vars() 43681 1727204736.73077: filtering new block on tags 43681 1727204736.73094: done filtering new block on tags 43681 1727204736.73095: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed-node3 43681 1727204736.73099: extending task lists for all hosts with included blocks 43681 1727204736.73233: done extending task lists 43681 1727204736.73234: done processing included files 43681 1727204736.73235: results queue empty 43681 1727204736.73235: checking for any_errors_fatal 43681 1727204736.73238: done checking for any_errors_fatal 43681 1727204736.73239: checking for max_fail_percentage 43681 1727204736.73240: done checking for max_fail_percentage 43681 1727204736.73240: checking to see if all hosts have failed and the running result is not ok 43681 1727204736.73241: done checking to see if all hosts have failed 43681 1727204736.73242: getting the remaining hosts for this loop 43681 1727204736.73243: done getting the remaining hosts for this loop 43681 1727204736.73246: getting the next task for host managed-node3 43681 1727204736.73249: done getting next task for host managed-node3 43681 1727204736.73251: ^ task is: TASK: Include the task 'get_interface_stat.yml' 43681 1727204736.73253: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204736.73255: getting variables 43681 1727204736.73256: in VariableManager get_vars() 43681 1727204736.73263: Calling all_inventory to load vars for managed-node3 43681 1727204736.73265: Calling groups_inventory to load vars for managed-node3 43681 1727204736.73267: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204736.73272: Calling all_plugins_play to load vars for managed-node3 43681 1727204736.73274: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204736.73276: Calling groups_plugins_play to load vars for managed-node3 43681 1727204736.74697: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204736.76285: done with get_vars() 43681 1727204736.76314: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Tuesday 24 September 2024 15:05:36 -0400 (0:00:00.078) 0:00:44.430 ***** 43681 1727204736.76381: entering _queue_task() for managed-node3/include_tasks 43681 1727204736.76770: worker is 1 (out of 1 available) 43681 1727204736.76783: exiting _queue_task() for managed-node3/include_tasks 43681 1727204736.76798: done queuing things up, now waiting for results queue to drain 43681 1727204736.76800: waiting for pending results... 43681 1727204736.77212: running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' 43681 1727204736.77223: in run() - task 12b410aa-8751-9e86-7728-0000000006c4 43681 1727204736.77227: variable 'ansible_search_path' from source: unknown 43681 1727204736.77230: variable 'ansible_search_path' from source: unknown 43681 1727204736.77261: calling self._execute() 43681 1727204736.77387: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204736.77403: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204736.77419: variable 'omit' from source: magic vars 43681 1727204736.77891: variable 'ansible_distribution_major_version' from source: facts 43681 1727204736.77987: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204736.77992: _execute() done 43681 1727204736.77996: dumping result to json 43681 1727204736.77998: done dumping result, returning 43681 1727204736.78000: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' [12b410aa-8751-9e86-7728-0000000006c4] 43681 1727204736.78003: sending task result for task 12b410aa-8751-9e86-7728-0000000006c4 43681 1727204736.78076: done sending task result for task 12b410aa-8751-9e86-7728-0000000006c4 43681 1727204736.78080: WORKER PROCESS EXITING 43681 1727204736.78119: no more pending results, returning what we have 43681 1727204736.78125: in VariableManager get_vars() 43681 1727204736.78162: Calling all_inventory to load vars for managed-node3 43681 1727204736.78165: Calling groups_inventory to load vars for managed-node3 43681 1727204736.78169: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204736.78185: Calling all_plugins_play to load vars for managed-node3 43681 1727204736.78188: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204736.78194: Calling groups_plugins_play to load vars for managed-node3 43681 1727204736.80831: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 43681 1727204736.83156: done with get_vars() 43681 1727204736.83186: variable 'ansible_search_path' from source: unknown 43681 1727204736.83188: variable 'ansible_search_path' from source: unknown 43681 1727204736.83224: we have included files to process 43681 1727204736.83225: generating all_blocks data 43681 1727204736.83227: done generating all_blocks data 43681 1727204736.83228: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 43681 1727204736.83229: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 43681 1727204736.83230: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 43681 1727204736.83393: done processing included file 43681 1727204736.83395: iterating over new_blocks loaded from include file 43681 1727204736.83396: in VariableManager get_vars() 43681 1727204736.83409: done with get_vars() 43681 1727204736.83411: filtering new block on tags 43681 1727204736.83425: done filtering new block on tags 43681 1727204736.83427: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node3 43681 1727204736.83432: extending task lists for all hosts with included blocks 43681 1727204736.83517: done extending task lists 43681 1727204736.83519: done processing included files 43681 1727204736.83519: results queue empty 43681 1727204736.83520: checking for any_errors_fatal 43681 1727204736.83525: done checking for any_errors_fatal 43681 1727204736.83526: checking for max_fail_percentage 43681 1727204736.83527: done checking for max_fail_percentage 43681 1727204736.83527: checking to see if all hosts have failed and the running result is not ok 43681 1727204736.83528: done checking to see if all hosts have failed 43681 1727204736.83528: getting the remaining hosts for this loop 43681 1727204736.83529: done getting the remaining hosts for this loop 43681 1727204736.83532: getting the next task for host managed-node3 43681 1727204736.83535: done getting next task for host managed-node3 43681 1727204736.83536: ^ task is: TASK: Get stat for interface {{ interface }} 43681 1727204736.83539: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204736.83541: getting variables 43681 1727204736.83542: in VariableManager get_vars() 43681 1727204736.83549: Calling all_inventory to load vars for managed-node3 43681 1727204736.83551: Calling groups_inventory to load vars for managed-node3 43681 1727204736.83553: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204736.83558: Calling all_plugins_play to load vars for managed-node3 43681 1727204736.83560: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204736.83562: Calling groups_plugins_play to load vars for managed-node3 43681 1727204736.85188: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204736.88413: done with get_vars() 43681 1727204736.88453: done getting variables 43681 1727204736.88651: variable 'interface' from source: set_fact TASK [Get stat for interface ethtest0] ***************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 15:05:36 -0400 (0:00:00.123) 0:00:44.553 ***** 43681 1727204736.88687: entering _queue_task() for managed-node3/stat 43681 1727204736.89093: worker is 1 (out of 1 available) 43681 1727204736.89108: exiting _queue_task() for managed-node3/stat 43681 1727204736.89125: done queuing things up, now waiting for results queue to drain 43681 1727204736.89127: waiting for pending results... 43681 1727204736.89513: running TaskExecutor() for managed-node3/TASK: Get stat for interface ethtest0 43681 1727204736.89559: in run() - task 12b410aa-8751-9e86-7728-0000000006de 43681 1727204736.89585: variable 'ansible_search_path' from source: unknown 43681 1727204736.89590: variable 'ansible_search_path' from source: unknown 43681 1727204736.89632: calling self._execute() 43681 1727204736.89747: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204736.89849: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204736.89854: variable 'omit' from source: magic vars 43681 1727204736.90235: variable 'ansible_distribution_major_version' from source: facts 43681 1727204736.90248: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204736.90257: variable 'omit' from source: magic vars 43681 1727204736.90320: variable 'omit' from source: magic vars 43681 1727204736.90460: variable 'interface' from source: set_fact 43681 1727204736.90482: variable 'omit' from source: magic vars 43681 1727204736.90530: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204736.90583: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204736.90614: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204736.90633: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204736.90651: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204736.90691: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204736.90695: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204736.90725: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node3' 43681 1727204736.90841: Set connection var ansible_shell_type to sh 43681 1727204736.90994: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204736.90998: Set connection var ansible_timeout to 10 43681 1727204736.91000: Set connection var ansible_pipelining to False 43681 1727204736.91003: Set connection var ansible_connection to ssh 43681 1727204736.91005: Set connection var ansible_shell_executable to /bin/sh 43681 1727204736.91007: variable 'ansible_shell_executable' from source: unknown 43681 1727204736.91010: variable 'ansible_connection' from source: unknown 43681 1727204736.91012: variable 'ansible_module_compression' from source: unknown 43681 1727204736.91015: variable 'ansible_shell_type' from source: unknown 43681 1727204736.91017: variable 'ansible_shell_executable' from source: unknown 43681 1727204736.91019: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204736.91024: variable 'ansible_pipelining' from source: unknown 43681 1727204736.91026: variable 'ansible_timeout' from source: unknown 43681 1727204736.91028: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204736.91210: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 43681 1727204736.91226: variable 'omit' from source: magic vars 43681 1727204736.91232: starting attempt loop 43681 1727204736.91235: running the handler 43681 1727204736.91251: _low_level_execute_command(): starting 43681 1727204736.91261: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 43681 1727204736.92085: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204736.92197: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204736.92219: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204736.92232: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204736.92253: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204736.92330: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204736.94133: stdout chunk (state=3): >>>/root <<< 43681 1727204736.94347: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204736.94351: stdout chunk (state=3): >>><<< 43681 1727204736.94354: stderr chunk (state=3): >>><<< 43681 1727204736.94494: _low_level_execute_command() done: rc=0, 
stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204736.94499: _low_level_execute_command(): starting 43681 1727204736.94502: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204736.9438105-45780-106668760301206 `" && echo ansible-tmp-1727204736.9438105-45780-106668760301206="` echo /root/.ansible/tmp/ansible-tmp-1727204736.9438105-45780-106668760301206 `" ) && sleep 0' 43681 1727204736.95114: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204736.95134: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204736.95152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204736.95196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204736.95300: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204736.95338: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204736.95427: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204736.97514: stdout chunk (state=3): >>>ansible-tmp-1727204736.9438105-45780-106668760301206=/root/.ansible/tmp/ansible-tmp-1727204736.9438105-45780-106668760301206 <<< 43681 1727204736.97802: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204736.97806: stdout chunk (state=3): >>><<< 43681 1727204736.97808: stderr chunk (state=3): >>><<< 43681 1727204736.97811: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204736.9438105-45780-106668760301206=/root/.ansible/tmp/ansible-tmp-1727204736.9438105-45780-106668760301206 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204736.97814: variable 'ansible_module_compression' from source: unknown 43681 1727204736.97854: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-436817kzep24q/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 43681 1727204736.97904: variable 'ansible_facts' from source: unknown 43681 1727204736.98001: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204736.9438105-45780-106668760301206/AnsiballZ_stat.py 43681 1727204736.98261: Sending initial data 43681 1727204736.98265: Sent initial data (153 bytes) 43681 1727204736.98809: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204736.98813: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204736.98815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204736.98818: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204736.99048: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204736.99052: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204736.99054: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204736.99057: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204736.99059: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204737.00680: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 
1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 43681 1727204737.00727: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 43681 1727204737.00782: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-436817kzep24q/tmp7gb6u287 /root/.ansible/tmp/ansible-tmp-1727204736.9438105-45780-106668760301206/AnsiballZ_stat.py <<< 43681 1727204737.00786: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204736.9438105-45780-106668760301206/AnsiballZ_stat.py" <<< 43681 1727204737.00828: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 43681 1727204737.00843: stderr chunk (state=3): >>>debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-436817kzep24q/tmp7gb6u287" to remote "/root/.ansible/tmp/ansible-tmp-1727204736.9438105-45780-106668760301206/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204736.9438105-45780-106668760301206/AnsiballZ_stat.py" <<< 43681 1727204737.01904: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204737.02014: stderr chunk (state=3): >>><<< 43681 1727204737.02025: stdout chunk (state=3): >>><<< 43681 1727204737.02050: done transferring module to remote 43681 1727204737.02072: _low_level_execute_command(): starting 43681 1727204737.02078: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204736.9438105-45780-106668760301206/ /root/.ansible/tmp/ansible-tmp-1727204736.9438105-45780-106668760301206/AnsiballZ_stat.py && sleep 0' 43681 1727204737.02894: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204737.02897: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204737.02900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204737.02902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204737.02904: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 43681 1727204737.02906: stderr chunk (state=3): >>>debug2: match not found <<< 43681 1727204737.02908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204737.02910: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 43681 1727204737.02912: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address <<< 43681 1727204737.02914: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 43681 1727204737.02916: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204737.02918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 
1727204737.02923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204737.02925: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 43681 1727204737.02927: stderr chunk (state=3): >>>debug2: match found <<< 43681 1727204737.02934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204737.02945: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204737.02975: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204737.03034: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204737.05009: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204737.05067: stderr chunk (state=3): >>><<< 43681 1727204737.05083: stdout chunk (state=3): >>><<< 43681 1727204737.05116: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204737.05224: _low_level_execute_command(): starting 43681 1727204737.05228: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204736.9438105-45780-106668760301206/AnsiballZ_stat.py && sleep 0' 43681 1727204737.05816: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204737.05876: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master 
debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204737.05912: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204737.05972: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204737.23080: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 43681 1727204737.24598: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 43681 1727204737.24602: stdout chunk (state=3): >>><<< 43681 1727204737.24605: stderr chunk (state=3): >>><<< 43681 1727204737.24608: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
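The module result above echoes the exact arguments the stat module ran with on the remote host: path /sys/class/net/ethtest0 with attribute, checksum and mime collection disabled. The surrounding entries show the full round trip over the multiplexed SSH connection: a remote temp directory is created, the cached AnsiballZ_stat.py payload is transferred with sftp and made executable, the module is run with /usr/bin/python3.12, and the temp directory is removed in the cleanup command that follows. A sketch of the task behind this invocation, reconstructed from the echoed module_args; the register name is an assumption:

    - name: Get stat for interface {{ interface }}
      stat:
        path: "/sys/class/net/{{ interface }}"   # resolves to /sys/class/net/ethtest0 here
        get_attributes: false
        get_checksum: false
        get_mime: false
      register: interface_stat   # assumed name, consumed by the assert that follows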
43681 1727204737.24634: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204736.9438105-45780-106668760301206/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 43681 1727204737.24727: _low_level_execute_command(): starting 43681 1727204737.24731: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204736.9438105-45780-106668760301206/ > /dev/null 2>&1 && sleep 0' 43681 1727204737.25352: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204737.25367: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204737.25404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 43681 1727204737.25515: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 43681 1727204737.25538: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204737.25563: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204737.25658: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204737.27647: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204737.27654: stdout chunk (state=3): >>><<< 43681 1727204737.27657: stderr chunk (state=3): >>><<< 43681 1727204737.27680: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204737.27694: handler run complete 43681 1727204737.27729: attempt loop complete, returning result 43681 1727204737.27737: _execute() done 43681 1727204737.27895: dumping result to json 43681 1727204737.27898: done dumping result, returning 43681 1727204737.27901: done running TaskExecutor() for managed-node3/TASK: Get stat for interface ethtest0 [12b410aa-8751-9e86-7728-0000000006de] 43681 1727204737.27903: sending task result for task 12b410aa-8751-9e86-7728-0000000006de 43681 1727204737.27980: done sending task result for task 12b410aa-8751-9e86-7728-0000000006de 43681 1727204737.27983: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "stat": { "exists": false } } 43681 1727204737.28070: no more pending results, returning what we have 43681 1727204737.28074: results queue empty 43681 1727204737.28076: checking for any_errors_fatal 43681 1727204737.28077: done checking for any_errors_fatal 43681 1727204737.28078: checking for max_fail_percentage 43681 1727204737.28080: done checking for max_fail_percentage 43681 1727204737.28081: checking to see if all hosts have failed and the running result is not ok 43681 1727204737.28083: done checking to see if all hosts have failed 43681 1727204737.28084: getting the remaining hosts for this loop 43681 1727204737.28085: done getting the remaining hosts for this loop 43681 1727204737.28093: getting the next task for host managed-node3 43681 1727204737.28108: done getting next task for host managed-node3 43681 1727204737.28111: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 43681 1727204737.28115: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204737.28119: getting variables 43681 1727204737.28123: in VariableManager get_vars() 43681 1727204737.28158: Calling all_inventory to load vars for managed-node3 43681 1727204737.28161: Calling groups_inventory to load vars for managed-node3 43681 1727204737.28166: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204737.28179: Calling all_plugins_play to load vars for managed-node3 43681 1727204737.28183: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204737.28186: Calling groups_plugins_play to load vars for managed-node3 43681 1727204737.31064: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204737.34471: done with get_vars() 43681 1727204737.34520: done getting variables 43681 1727204737.34595: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 43681 1727204737.34734: variable 'interface' from source: set_fact TASK [Assert that the interface is absent - 'ethtest0'] ************************ task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Tuesday 24 September 2024 15:05:37 -0400 (0:00:00.460) 0:00:45.014 ***** 43681 1727204737.34773: entering _queue_task() for managed-node3/assert 43681 1727204737.35150: worker is 1 (out of 1 available) 43681 1727204737.35167: exiting _queue_task() for managed-node3/assert 43681 1727204737.35182: done queuing things up, now waiting for results queue to drain 43681 1727204737.35184: waiting for pending results... 
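The assert task queued here (task path .../tasks/assert_device_absent.yml:5) evaluates the condition shown a little further down in the trace, "not interface_stat.stat.exists". A minimal sketch of such a task follows; the failure message text is illustrative and not taken from this log:

  - name: Assert that the interface is absent - '{{ interface }}'
    ansible.builtin.assert:
      that:
        - not interface_stat.stat.exists
      msg: "Interface {{ interface }} is still present"   # illustrative message, not from the log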
43681 1727204737.35612: running TaskExecutor() for managed-node3/TASK: Assert that the interface is absent - 'ethtest0' 43681 1727204737.35623: in run() - task 12b410aa-8751-9e86-7728-0000000006c5 43681 1727204737.35628: variable 'ansible_search_path' from source: unknown 43681 1727204737.35631: variable 'ansible_search_path' from source: unknown 43681 1727204737.35638: calling self._execute() 43681 1727204737.35766: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204737.35780: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204737.35800: variable 'omit' from source: magic vars 43681 1727204737.36266: variable 'ansible_distribution_major_version' from source: facts 43681 1727204737.36293: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204737.36306: variable 'omit' from source: magic vars 43681 1727204737.36365: variable 'omit' from source: magic vars 43681 1727204737.36498: variable 'interface' from source: set_fact 43681 1727204737.36528: variable 'omit' from source: magic vars 43681 1727204737.36577: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204737.36694: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204737.36698: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204737.36702: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204737.36704: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204737.36740: variable 'inventory_hostname' from source: host vars for 'managed-node3' 43681 1727204737.36826: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204737.36830: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204737.36896: Set connection var ansible_shell_type to sh 43681 1727204737.36909: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204737.36920: Set connection var ansible_timeout to 10 43681 1727204737.36944: Set connection var ansible_pipelining to False 43681 1727204737.36956: Set connection var ansible_connection to ssh 43681 1727204737.36966: Set connection var ansible_shell_executable to /bin/sh 43681 1727204737.37000: variable 'ansible_shell_executable' from source: unknown 43681 1727204737.37009: variable 'ansible_connection' from source: unknown 43681 1727204737.37016: variable 'ansible_module_compression' from source: unknown 43681 1727204737.37027: variable 'ansible_shell_type' from source: unknown 43681 1727204737.37043: variable 'ansible_shell_executable' from source: unknown 43681 1727204737.37094: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204737.37098: variable 'ansible_pipelining' from source: unknown 43681 1727204737.37100: variable 'ansible_timeout' from source: unknown 43681 1727204737.37102: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204737.37256: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 43681 1727204737.37276: variable 'omit' from source: magic vars 43681 1727204737.37287: starting attempt loop 43681 1727204737.37299: running the handler 43681 1727204737.37495: variable 'interface_stat' from source: set_fact 43681 1727204737.37513: Evaluated conditional (not interface_stat.stat.exists): True 43681 1727204737.37583: handler run complete 43681 1727204737.37591: attempt loop complete, returning result 43681 1727204737.37594: _execute() done 43681 1727204737.37597: dumping result to json 43681 1727204737.37599: done dumping result, returning 43681 1727204737.37601: done running TaskExecutor() for managed-node3/TASK: Assert that the interface is absent - 'ethtest0' [12b410aa-8751-9e86-7728-0000000006c5] 43681 1727204737.37604: sending task result for task 12b410aa-8751-9e86-7728-0000000006c5 43681 1727204737.37917: done sending task result for task 12b410aa-8751-9e86-7728-0000000006c5 ok: [managed-node3] => { "changed": false } MSG: All assertions passed 43681 1727204737.37969: no more pending results, returning what we have 43681 1727204737.37973: results queue empty 43681 1727204737.37975: checking for any_errors_fatal 43681 1727204737.37983: done checking for any_errors_fatal 43681 1727204737.37983: checking for max_fail_percentage 43681 1727204737.37985: done checking for max_fail_percentage 43681 1727204737.37986: checking to see if all hosts have failed and the running result is not ok 43681 1727204737.37987: done checking to see if all hosts have failed 43681 1727204737.37988: getting the remaining hosts for this loop 43681 1727204737.37991: done getting the remaining hosts for this loop 43681 1727204737.37995: getting the next task for host managed-node3 43681 1727204737.38003: done getting next task for host managed-node3 43681 1727204737.38007: ^ task is: TASK: Verify network state restored to default 43681 1727204737.38009: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204737.38013: getting variables 43681 1727204737.38015: in VariableManager get_vars() 43681 1727204737.38043: Calling all_inventory to load vars for managed-node3 43681 1727204737.38046: Calling groups_inventory to load vars for managed-node3 43681 1727204737.38050: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204737.38057: WORKER PROCESS EXITING 43681 1727204737.38086: Calling all_plugins_play to load vars for managed-node3 43681 1727204737.38093: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204737.38098: Calling groups_plugins_play to load vars for managed-node3 43681 1727204737.40638: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204737.43809: done with get_vars() 43681 1727204737.43850: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:236 Tuesday 24 September 2024 15:05:37 -0400 (0:00:00.091) 0:00:45.106 ***** 43681 1727204737.43968: entering _queue_task() for managed-node3/include_tasks 43681 1727204737.44357: worker is 1 (out of 1 available) 43681 1727204737.44373: exiting _queue_task() for managed-node3/include_tasks 43681 1727204737.44386: done queuing things up, now waiting for results queue to drain 43681 1727204737.44388: waiting for pending results... 43681 1727204737.44725: running TaskExecutor() for managed-node3/TASK: Verify network state restored to default 43681 1727204737.44847: in run() - task 12b410aa-8751-9e86-7728-0000000000ab 43681 1727204737.44870: variable 'ansible_search_path' from source: unknown 43681 1727204737.44925: calling self._execute() 43681 1727204737.45055: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204737.45071: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204737.45087: variable 'omit' from source: magic vars 43681 1727204737.45796: variable 'ansible_distribution_major_version' from source: facts 43681 1727204737.45801: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204737.45803: _execute() done 43681 1727204737.45806: dumping result to json 43681 1727204737.45809: done dumping result, returning 43681 1727204737.45811: done running TaskExecutor() for managed-node3/TASK: Verify network state restored to default [12b410aa-8751-9e86-7728-0000000000ab] 43681 1727204737.45814: sending task result for task 12b410aa-8751-9e86-7728-0000000000ab 43681 1727204737.45894: done sending task result for task 12b410aa-8751-9e86-7728-0000000000ab 43681 1727204737.45898: WORKER PROCESS EXITING 43681 1727204737.45933: no more pending results, returning what we have 43681 1727204737.45939: in VariableManager get_vars() 43681 1727204737.45981: Calling all_inventory to load vars for managed-node3 43681 1727204737.45985: Calling groups_inventory to load vars for managed-node3 43681 1727204737.45991: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204737.46008: Calling all_plugins_play to load vars for managed-node3 43681 1727204737.46013: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204737.46017: Calling groups_plugins_play to load vars for managed-node3 43681 1727204737.48613: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 
1727204737.51970: done with get_vars() 43681 1727204737.52005: variable 'ansible_search_path' from source: unknown 43681 1727204737.52027: we have included files to process 43681 1727204737.52029: generating all_blocks data 43681 1727204737.52031: done generating all_blocks data 43681 1727204737.52035: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 43681 1727204737.52037: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 43681 1727204737.52040: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 43681 1727204737.52635: done processing included file 43681 1727204737.52637: iterating over new_blocks loaded from include file 43681 1727204737.52639: in VariableManager get_vars() 43681 1727204737.52656: done with get_vars() 43681 1727204737.52658: filtering new block on tags 43681 1727204737.52681: done filtering new block on tags 43681 1727204737.52684: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed-node3 43681 1727204737.52695: extending task lists for all hosts with included blocks 43681 1727204737.53099: done extending task lists 43681 1727204737.53101: done processing included files 43681 1727204737.53102: results queue empty 43681 1727204737.53103: checking for any_errors_fatal 43681 1727204737.53108: done checking for any_errors_fatal 43681 1727204737.53109: checking for max_fail_percentage 43681 1727204737.53111: done checking for max_fail_percentage 43681 1727204737.53112: checking to see if all hosts have failed and the running result is not ok 43681 1727204737.53113: done checking to see if all hosts have failed 43681 1727204737.53115: getting the remaining hosts for this loop 43681 1727204737.53116: done getting the remaining hosts for this loop 43681 1727204737.53120: getting the next task for host managed-node3 43681 1727204737.53127: done getting next task for host managed-node3 43681 1727204737.53131: ^ task is: TASK: Check routes and DNS 43681 1727204737.53135: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204737.53138: getting variables 43681 1727204737.53140: in VariableManager get_vars() 43681 1727204737.53152: Calling all_inventory to load vars for managed-node3 43681 1727204737.53160: Calling groups_inventory to load vars for managed-node3 43681 1727204737.53164: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204737.53172: Calling all_plugins_play to load vars for managed-node3 43681 1727204737.53175: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204737.53179: Calling groups_plugins_play to load vars for managed-node3 43681 1727204737.60543: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204737.63698: done with get_vars() 43681 1727204737.63744: done getting variables 43681 1727204737.63814: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Tuesday 24 September 2024 15:05:37 -0400 (0:00:00.198) 0:00:45.305 ***** 43681 1727204737.63849: entering _queue_task() for managed-node3/shell 43681 1727204737.64254: worker is 1 (out of 1 available) 43681 1727204737.64268: exiting _queue_task() for managed-node3/shell 43681 1727204737.64282: done queuing things up, now waiting for results queue to drain 43681 1727204737.64285: waiting for pending results... 43681 1727204737.64575: running TaskExecutor() for managed-node3/TASK: Check routes and DNS 43681 1727204737.64607: in run() - task 12b410aa-8751-9e86-7728-0000000006f6 43681 1727204737.64669: variable 'ansible_search_path' from source: unknown 43681 1727204737.64673: variable 'ansible_search_path' from source: unknown 43681 1727204737.64676: calling self._execute() 43681 1727204737.64782: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204737.64794: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204737.64805: variable 'omit' from source: magic vars 43681 1727204737.65295: variable 'ansible_distribution_major_version' from source: facts 43681 1727204737.65300: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204737.65302: variable 'omit' from source: magic vars 43681 1727204737.65317: variable 'omit' from source: magic vars 43681 1727204737.65366: variable 'omit' from source: magic vars 43681 1727204737.65412: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 43681 1727204737.65455: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 43681 1727204737.65481: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 43681 1727204737.65506: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204737.65549: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 43681 1727204737.65557: variable 'inventory_hostname' from source: host vars for 'managed-node3' 
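The "Check routes and DNS" task being executed here is a shell task whose script appears verbatim further down in the module invocation (the _raw_params field). Reconstructed as a task it looks roughly like the sketch below; the changed_when: false line is an inference from the "Evaluated conditional (False): False" entry and the final changed: false result, not something printed literally:

  - name: Check routes and DNS
    ansible.builtin.shell: |
      set -euo pipefail
      echo IP
      ip a
      echo IP ROUTE
      ip route
      echo IP -6 ROUTE
      ip -6 route
      echo RESOLV
      if [ -f /etc/resolv.conf ]; then
        cat /etc/resolv.conf
      else
        echo NO /etc/resolv.conf
        ls -alrtF /etc/resolv.* || :
      fi
    changed_when: false   # inferred: module reported changed=true, task result shows changed=false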
43681 1727204737.65562: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204737.65568: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204737.65769: Set connection var ansible_shell_type to sh 43681 1727204737.65773: Set connection var ansible_module_compression to ZIP_DEFLATED 43681 1727204737.65775: Set connection var ansible_timeout to 10 43681 1727204737.65778: Set connection var ansible_pipelining to False 43681 1727204737.65781: Set connection var ansible_connection to ssh 43681 1727204737.65784: Set connection var ansible_shell_executable to /bin/sh 43681 1727204737.65786: variable 'ansible_shell_executable' from source: unknown 43681 1727204737.65788: variable 'ansible_connection' from source: unknown 43681 1727204737.65793: variable 'ansible_module_compression' from source: unknown 43681 1727204737.65795: variable 'ansible_shell_type' from source: unknown 43681 1727204737.65797: variable 'ansible_shell_executable' from source: unknown 43681 1727204737.65800: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204737.65802: variable 'ansible_pipelining' from source: unknown 43681 1727204737.65805: variable 'ansible_timeout' from source: unknown 43681 1727204737.65807: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204737.65963: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 43681 1727204737.65983: variable 'omit' from source: magic vars 43681 1727204737.65992: starting attempt loop 43681 1727204737.65995: running the handler 43681 1727204737.66002: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 43681 1727204737.66095: _low_level_execute_command(): starting 43681 1727204737.66100: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 43681 1727204737.67224: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204737.67412: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204737.67572: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 43681 1727204737.69364: stdout chunk (state=3): >>>/root <<< 43681 1727204737.69472: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204737.69655: stderr chunk (state=3): >>><<< 43681 1727204737.69659: stdout chunk (state=3): >>><<< 43681 1727204737.69687: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204737.69704: _low_level_execute_command(): starting 43681 1727204737.69896: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204737.6968718-45797-177444080375567 `" && echo ansible-tmp-1727204737.6968718-45797-177444080375567="` echo /root/.ansible/tmp/ansible-tmp-1727204737.6968718-45797-177444080375567 `" ) && sleep 0' 43681 1727204737.71108: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204737.71325: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204737.71338: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204737.71405: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204737.73412: stdout chunk (state=3): >>>ansible-tmp-1727204737.6968718-45797-177444080375567=/root/.ansible/tmp/ansible-tmp-1727204737.6968718-45797-177444080375567 <<< 43681 1727204737.73713: stderr chunk (state=3): >>>debug2: Received exit status 
from master 0 <<< 43681 1727204737.73716: stdout chunk (state=3): >>><<< 43681 1727204737.73719: stderr chunk (state=3): >>><<< 43681 1727204737.73722: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204737.6968718-45797-177444080375567=/root/.ansible/tmp/ansible-tmp-1727204737.6968718-45797-177444080375567 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204737.73724: variable 'ansible_module_compression' from source: unknown 43681 1727204737.73750: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-436817kzep24q/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 43681 1727204737.73800: variable 'ansible_facts' from source: unknown 43681 1727204737.73896: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204737.6968718-45797-177444080375567/AnsiballZ_command.py 43681 1727204737.74149: Sending initial data 43681 1727204737.74152: Sent initial data (156 bytes) 43681 1727204737.74693: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204737.74710: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204737.74722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204737.74807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204737.74841: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204737.74854: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204737.74864: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204737.74943: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
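The repeated "auto-mux: Trying existing master" / "mux_client_request_session" lines in the SSH stderr show that each of these short-lived commands reuses one persistent ControlMaster connection instead of renegotiating SSH. Ansible's ssh connection plugin enables this by default through its ssh_args setting; an equivalent explicit setting in inventory or group_vars would look like the sketch below, where the ControlPersist value is illustrative rather than taken from this run:

  # group_vars sketch; values mirror the plugin defaults and are for illustration only
  ansible_ssh_common_args: >-
    -o ControlMaster=auto
    -o ControlPersist=60s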
43681 1727204737.76591: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 43681 1727204737.76633: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 43681 1727204737.76700: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-436817kzep24q/tmpqycal4ad /root/.ansible/tmp/ansible-tmp-1727204737.6968718-45797-177444080375567/AnsiballZ_command.py <<< 43681 1727204737.76704: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204737.6968718-45797-177444080375567/AnsiballZ_command.py" <<< 43681 1727204737.76754: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-436817kzep24q/tmpqycal4ad" to remote "/root/.ansible/tmp/ansible-tmp-1727204737.6968718-45797-177444080375567/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204737.6968718-45797-177444080375567/AnsiballZ_command.py" <<< 43681 1727204737.77998: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204737.78050: stderr chunk (state=3): >>><<< 43681 1727204737.78060: stdout chunk (state=3): >>><<< 43681 1727204737.78118: done transferring module to remote 43681 1727204737.78196: _low_level_execute_command(): starting 43681 1727204737.78202: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204737.6968718-45797-177444080375567/ /root/.ansible/tmp/ansible-tmp-1727204737.6968718-45797-177444080375567/AnsiballZ_command.py && sleep 0' 43681 1727204737.78954: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204737.79121: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204737.79125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 43681 1727204737.79150: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204737.79178: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204737.79247: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204737.81109: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204737.81209: stderr chunk (state=3): >>><<< 43681 1727204737.81213: stdout chunk (state=3): >>><<< 43681 1727204737.81352: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204737.81356: _low_level_execute_command(): starting 43681 1727204737.81359: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204737.6968718-45797-177444080375567/AnsiballZ_command.py && sleep 0' 43681 1727204737.81981: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204737.82002: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204737.82017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204737.82047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 43681 1727204737.82065: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 43681 1727204737.82116: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204737.82204: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204737.82229: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204737.82295: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204738.00823: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: 
lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:5e:c8:16:36:1d brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.10.90/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 2388sec preferred_lft 2388sec\n inet6 fe80::37d3:4e93:30d:de94/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.10.90 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.10.90 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8).\n# Do not edit.\n#\n# This file might be symlinked as /etc/resolv.conf. If you're looking at\n# /etc/resolv.conf and seeing this text, you have followed the symlink.\n#\n# This is a dynamic resolv.conf file for connecting local clients to the\n# internal DNS stub resolver of systemd-resolved. This file lists all\n# configured search domains.\n#\n# Run \"resolvectl status\" to see details about the uplink DNS servers\n# currently in use.\n#\n# Third party programs should typically not access this file directly, but only\n# through the symlink at /etc/resolv.conf. To manage man:resolv.conf(5) in a\n# different way, replace this symlink by a static file or a different symlink.\n#\n# See man:systemd-resolved.service(8) for details about the supported modes of\n# operation for /etc/resolv.conf.\n\nnameserver 127.0.0.53\noptions edns0 trust-ad\nsearch us-east-1.aws.redhat.com", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 15:05:37.997770", "end": "2024-09-24 15:05:38.006713", "delta": "0:00:00.008943", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 43681 1727204738.02796: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
<<< 43681 1727204738.02801: stdout chunk (state=3): >>><<< 43681 1727204738.02803: stderr chunk (state=3): >>><<< 43681 1727204738.02806: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:5e:c8:16:36:1d brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.10.90/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 2388sec preferred_lft 2388sec\n inet6 fe80::37d3:4e93:30d:de94/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.10.90 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.10.90 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8).\n# Do not edit.\n#\n# This file might be symlinked as /etc/resolv.conf. If you're looking at\n# /etc/resolv.conf and seeing this text, you have followed the symlink.\n#\n# This is a dynamic resolv.conf file for connecting local clients to the\n# internal DNS stub resolver of systemd-resolved. This file lists all\n# configured search domains.\n#\n# Run \"resolvectl status\" to see details about the uplink DNS servers\n# currently in use.\n#\n# Third party programs should typically not access this file directly, but only\n# through the symlink at /etc/resolv.conf. To manage man:resolv.conf(5) in a\n# different way, replace this symlink by a static file or a different symlink.\n#\n# See man:systemd-resolved.service(8) for details about the supported modes of\n# operation for /etc/resolv.conf.\n\nnameserver 127.0.0.53\noptions edns0 trust-ad\nsearch us-east-1.aws.redhat.com", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 15:05:37.997770", "end": "2024-09-24 15:05:38.006713", "delta": "0:00:00.008943", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 43681 1727204738.02814: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204737.6968718-45797-177444080375567/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 43681 1727204738.02818: _low_level_execute_command(): starting 43681 1727204738.02820: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204737.6968718-45797-177444080375567/ > /dev/null 2>&1 && sleep 0' 43681 1727204738.03423: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 43681 1727204738.03437: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 43681 1727204738.03456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 43681 1727204738.03510: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 43681 1727204738.03586: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 43681 1727204738.03610: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 43681 1727204738.03620: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 43681 1727204738.03686: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 43681 1727204738.05809: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 43681 1727204738.05812: stdout chunk (state=3): >>><<< 43681 1727204738.05815: stderr 
chunk (state=3): >>><<< 43681 1727204738.05817: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 43681 1727204738.05820: handler run complete 43681 1727204738.05822: Evaluated conditional (False): False 43681 1727204738.05824: attempt loop complete, returning result 43681 1727204738.05831: _execute() done 43681 1727204738.05834: dumping result to json 43681 1727204738.05836: done dumping result, returning 43681 1727204738.05838: done running TaskExecutor() for managed-node3/TASK: Check routes and DNS [12b410aa-8751-9e86-7728-0000000006f6] 43681 1727204738.05844: sending task result for task 12b410aa-8751-9e86-7728-0000000006f6 43681 1727204738.05986: done sending task result for task 12b410aa-8751-9e86-7728-0000000006f6 43681 1727204738.05990: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008943", "end": "2024-09-24 15:05:38.006713", "rc": 0, "start": "2024-09-24 15:05:37.997770" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 12:5e:c8:16:36:1d brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.10.90/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0 valid_lft 2388sec preferred_lft 2388sec inet6 fe80::37d3:4e93:30d:de94/64 scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.8.1 dev eth0 proto dhcp src 10.31.10.90 metric 100 10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.10.90 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8). # Do not edit. # # This file might be symlinked as /etc/resolv.conf. If you're looking at # /etc/resolv.conf and seeing this text, you have followed the symlink. # # This is a dynamic resolv.conf file for connecting local clients to the # internal DNS stub resolver of systemd-resolved. 
This file lists all # configured search domains. # # Run "resolvectl status" to see details about the uplink DNS servers # currently in use. # # Third party programs should typically not access this file directly, but only # through the symlink at /etc/resolv.conf. To manage man:resolv.conf(5) in a # different way, replace this symlink by a static file or a different symlink. # # See man:systemd-resolved.service(8) for details about the supported modes of # operation for /etc/resolv.conf. nameserver 127.0.0.53 options edns0 trust-ad search us-east-1.aws.redhat.com 43681 1727204738.06400: no more pending results, returning what we have 43681 1727204738.06404: results queue empty 43681 1727204738.06405: checking for any_errors_fatal 43681 1727204738.06407: done checking for any_errors_fatal 43681 1727204738.06408: checking for max_fail_percentage 43681 1727204738.06410: done checking for max_fail_percentage 43681 1727204738.06411: checking to see if all hosts have failed and the running result is not ok 43681 1727204738.06412: done checking to see if all hosts have failed 43681 1727204738.06413: getting the remaining hosts for this loop 43681 1727204738.06415: done getting the remaining hosts for this loop 43681 1727204738.06419: getting the next task for host managed-node3 43681 1727204738.06428: done getting next task for host managed-node3 43681 1727204738.06431: ^ task is: TASK: Verify DNS and network connectivity 43681 1727204738.06439: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 43681 1727204738.06444: getting variables 43681 1727204738.06445: in VariableManager get_vars() 43681 1727204738.06477: Calling all_inventory to load vars for managed-node3 43681 1727204738.06480: Calling groups_inventory to load vars for managed-node3 43681 1727204738.06484: Calling all_plugins_inventory to load vars for managed-node3 43681 1727204738.06504: Calling all_plugins_play to load vars for managed-node3 43681 1727204738.06508: Calling groups_plugins_inventory to load vars for managed-node3 43681 1727204738.06513: Calling groups_plugins_play to load vars for managed-node3 43681 1727204738.09085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 43681 1727204738.12515: done with get_vars() 43681 1727204738.12561: done getting variables 43681 1727204738.12653: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Tuesday 24 September 2024 15:05:38 -0400 (0:00:00.488) 0:00:45.793 ***** 43681 1727204738.12695: entering _queue_task() for managed-node3/shell 43681 1727204738.13124: worker is 1 (out of 1 available) 43681 1727204738.13139: exiting _queue_task() for managed-node3/shell 43681 1727204738.13157: done queuing things up, now waiting for results queue to drain 43681 1727204738.13159: waiting for pending results... 43681 1727204738.13610: running TaskExecutor() for managed-node3/TASK: Verify DNS and network connectivity 43681 1727204738.13615: in run() - task 12b410aa-8751-9e86-7728-0000000006f7 43681 1727204738.13644: variable 'ansible_search_path' from source: unknown 43681 1727204738.13647: variable 'ansible_search_path' from source: unknown 43681 1727204738.13688: calling self._execute() 43681 1727204738.13827: variable 'ansible_host' from source: host vars for 'managed-node3' 43681 1727204738.13842: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 43681 1727204738.13862: variable 'omit' from source: magic vars 43681 1727204738.14349: variable 'ansible_distribution_major_version' from source: facts 43681 1727204738.14364: Evaluated conditional (ansible_distribution_major_version != '6'): True 43681 1727204738.14564: variable 'ansible_facts' from source: unknown 43681 1727204738.15869: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): False 43681 1727204738.15873: when evaluation is False, skipping this task 43681 1727204738.15877: _execute() done 43681 1727204738.15880: dumping result to json 43681 1727204738.15885: done dumping result, returning 43681 1727204738.15896: done running TaskExecutor() for managed-node3/TASK: Verify DNS and network connectivity [12b410aa-8751-9e86-7728-0000000006f7] 43681 1727204738.15907: sending task result for task 12b410aa-8751-9e86-7728-0000000006f7 43681 1727204738.16183: done sending task result for task 12b410aa-8751-9e86-7728-0000000006f7 43681 1727204738.16186: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_facts[\"distribution\"] == \"CentOS\"", "skip_reason": "Conditional result was False" } 43681 1727204738.16347: no more 
43681 1727204738.16351: results queue empty
43681 1727204738.16352: checking for any_errors_fatal
43681 1727204738.16365: done checking for any_errors_fatal
43681 1727204738.16366: checking for max_fail_percentage
43681 1727204738.16368: done checking for max_fail_percentage
43681 1727204738.16369: checking to see if all hosts have failed and the running result is not ok
43681 1727204738.16370: done checking to see if all hosts have failed
43681 1727204738.16371: getting the remaining hosts for this loop
43681 1727204738.16373: done getting the remaining hosts for this loop
43681 1727204738.16377: getting the next task for host managed-node3
43681 1727204738.16386: done getting next task for host managed-node3
43681 1727204738.16388: ^ task is: TASK: meta (flush_handlers)
43681 1727204738.16393: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
43681 1727204738.16398: getting variables
43681 1727204738.16400: in VariableManager get_vars()
43681 1727204738.16434: Calling all_inventory to load vars for managed-node3
43681 1727204738.16437: Calling groups_inventory to load vars for managed-node3
43681 1727204738.16442: Calling all_plugins_inventory to load vars for managed-node3
43681 1727204738.16453: Calling all_plugins_play to load vars for managed-node3
43681 1727204738.16457: Calling groups_plugins_inventory to load vars for managed-node3
43681 1727204738.16461: Calling groups_plugins_play to load vars for managed-node3
43681 1727204738.18812: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
43681 1727204738.20698: done with get_vars()
43681 1727204738.20724: done getting variables
43681 1727204738.20785: in VariableManager get_vars()
43681 1727204738.20795: Calling all_inventory to load vars for managed-node3
43681 1727204738.20798: Calling groups_inventory to load vars for managed-node3
43681 1727204738.20800: Calling all_plugins_inventory to load vars for managed-node3
43681 1727204738.20804: Calling all_plugins_play to load vars for managed-node3
43681 1727204738.20806: Calling groups_plugins_inventory to load vars for managed-node3
43681 1727204738.20808: Calling groups_plugins_play to load vars for managed-node3
43681 1727204738.22574: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
43681 1727204738.24237: done with get_vars()
43681 1727204738.24280: done queuing things up, now waiting for results queue to drain
43681 1727204738.24283: results queue empty
43681 1727204738.24284: checking for any_errors_fatal
43681 1727204738.24287: done checking for any_errors_fatal
43681 1727204738.24288: checking for max_fail_percentage
43681 1727204738.24293: done checking for max_fail_percentage
43681 1727204738.24294: checking to see if all hosts have failed and the running result is not ok
43681 1727204738.24295: done checking to see if all hosts have failed
43681 1727204738.24296: getting the remaining hosts for this loop
43681 1727204738.24297: done getting the remaining hosts for this loop
43681 1727204738.24300: getting the next task for host managed-node3
43681 1727204738.24305: done getting next task for host managed-node3
43681 1727204738.24307: ^ task is: TASK: meta (flush_handlers)
43681 1727204738.24309: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
43681 1727204738.24312: getting variables
43681 1727204738.24313: in VariableManager get_vars()
43681 1727204738.24329: Calling all_inventory to load vars for managed-node3
43681 1727204738.24332: Calling groups_inventory to load vars for managed-node3
43681 1727204738.24335: Calling all_plugins_inventory to load vars for managed-node3
43681 1727204738.24342: Calling all_plugins_play to load vars for managed-node3
43681 1727204738.24345: Calling groups_plugins_inventory to load vars for managed-node3
43681 1727204738.24349: Calling groups_plugins_play to load vars for managed-node3
43681 1727204738.26498: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
43681 1727204738.29593: done with get_vars()
43681 1727204738.29633: done getting variables
43681 1727204738.29700: in VariableManager get_vars()
43681 1727204738.29713: Calling all_inventory to load vars for managed-node3
43681 1727204738.29716: Calling groups_inventory to load vars for managed-node3
43681 1727204738.29719: Calling all_plugins_inventory to load vars for managed-node3
43681 1727204738.29728: Calling all_plugins_play to load vars for managed-node3
43681 1727204738.29731: Calling groups_plugins_inventory to load vars for managed-node3
43681 1727204738.29735: Calling groups_plugins_play to load vars for managed-node3
43681 1727204738.31823: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
43681 1727204738.35124: done with get_vars()
43681 1727204738.35168: done queuing things up, now waiting for results queue to drain
43681 1727204738.35171: results queue empty
43681 1727204738.35172: checking for any_errors_fatal
43681 1727204738.35174: done checking for any_errors_fatal
43681 1727204738.35175: checking for max_fail_percentage
43681 1727204738.35176: done checking for max_fail_percentage
43681 1727204738.35177: checking to see if all hosts have failed and the running result is not ok
43681 1727204738.35178: done checking to see if all hosts have failed
43681 1727204738.35179: getting the remaining hosts for this loop
43681 1727204738.35180: done getting the remaining hosts for this loop
43681 1727204738.35194: getting the next task for host managed-node3
43681 1727204738.35198: done getting next task for host managed-node3
43681 1727204738.35199: ^ task is: None
43681 1727204738.35201: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
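The last two items the strategy schedules here are meta (flush_handlers) steps, which Ansible inserts at the end of each task section so that any notified handlers run before the play moves on. The construct is illustrated by the sketch below; it is a generic example, not taken from this test suite.

# Illustrative only -- generic example of the construct reported as "TASK: meta (flush_handlers)"
- hosts: managed-node3
  tasks:
    - name: Make a change that notifies a handler
      ansible.builtin.command: /bin/true
      changed_when: true
      notify: Example handler

    - name: Run pending handlers now rather than at the end of the play
      ansible.builtin.meta: flush_handlers

  handlers:
    - name: Example handler
      ansible.builtin.debug:
        msg: Handler ran before the remaining tasks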
43681 1727204738.35203: done queuing things up, now waiting for results queue to drain
43681 1727204738.35204: results queue empty
43681 1727204738.35205: checking for any_errors_fatal
43681 1727204738.35206: done checking for any_errors_fatal
43681 1727204738.35207: checking for max_fail_percentage
43681 1727204738.35208: done checking for max_fail_percentage
43681 1727204738.35209: checking to see if all hosts have failed and the running result is not ok
43681 1727204738.35210: done checking to see if all hosts have failed
43681 1727204738.35211: getting the next task for host managed-node3
43681 1727204738.35214: done getting next task for host managed-node3
43681 1727204738.35215: ^ task is: None
43681 1727204738.35217: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed-node3 : ok=86 changed=5 unreachable=0 failed=0 skipped=74 rescued=0 ignored=1

Tuesday 24 September 2024 15:05:38 -0400 (0:00:00.226) 0:00:46.019 *****
===============================================================================
fedora.linux_system_roles.network : Check which services are running ---- 2.35s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 2.31s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 2.25s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Install iproute --------------------------------------------------------- 1.88s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
fedora.linux_system_roles.network : Check which packages are installed --- 1.39s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gathering Facts --------------------------------------------------------- 1.34s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_routing_rules_nm.yml:6
Gathering Facts --------------------------------------------------------- 1.26s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3
Create veth interface ethtest0 ------------------------------------------ 1.25s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 1.25s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Gathering Facts --------------------------------------------------------- 1.13s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3
fedora.linux_system_roles.network : Check which packages are installed --- 1.06s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gathering Facts --------------------------------------------------------- 1.03s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5
Gathering Facts --------------------------------------------------------- 1.00s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:227
fedora.linux_system_roles.network : Check which packages are installed --- 1.00s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gathering Facts --------------------------------------------------------- 0.97s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:3
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.95s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.83s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.82s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Remove the dedicated test file in `/etc/iproute2/rt_tables.d/` ---------- 0.78s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:213
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.72s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
43681 1727204738.35363: RUNNING CLEANUP
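Among the cleanup steps listed in the recap is the removal of a dedicated routing-table file under /etc/iproute2/rt_tables.d/. A task of that shape could look like the sketch below; the file name is a placeholder, since the actual name used by tests_routing_rules.yml is not visible in this log.

# Hypothetical sketch -- the file name is a placeholder for illustration only
- name: Remove the dedicated test file in `/etc/iproute2/rt_tables.d/`
  ansible.builtin.file:
    path: /etc/iproute2/rt_tables.d/99-test-table.conf  # assumed name
    state: absent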